Keeping the workspace safe with
Content Reporting in Google Chat

Google / User Experience Design / 2021-2023

What was the problem?

As the adoption of chat platforms at work has grown in recent years, so has the number of incidents involving inappropriate messages and attachments. During our client feedback sessions, we consistently received requests for better tools to manage and moderate chat environments.

Examples of problematic messages and attachments included

  • Offensive language, which created an uncomfortable work atmosphere

  • Images and videos, which were inappropriate for the workspace

  • Sensitive personal information, which should be kept private

  • Confidential company data, which should be kept secure internally

Example of offensive language and sensitive information

How did we solve the problem?

We created a content reporting feature for users in Google Chat, as well as an Admin Console tool for admins to review and take action on the reported content.

End users in Google Chat
When users find a message or attachment inappropriate, they can use the report feature to flag the content to admins. They will be required to select a reason for reporting and will have the option to add an explanation.
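The report flow above pairs a required reason with an optional explanation. As a rough sketch, the data such a report might carry could look like the following; all field names and the reason list are illustrative assumptions, not Google Chat's actual schema.

```python
from dataclasses import dataclass
from typing import Optional

# Hypothetical reporting reasons, mirroring the content categories above.
REPORT_REASONS = [
    "Offensive language",
    "Inappropriate image or video",
    "Sensitive personal information",
    "Confidential company data",
]

@dataclass
class ContentReport:
    """Illustrative payload a user's report might produce."""
    message_id: str
    reporter_id: str
    reason: str                        # required: one of REPORT_REASONS
    explanation: Optional[str] = None  # optional free-text note

    def __post_init__(self):
        # Selecting a reason is mandatory; an explanation is not.
        if self.reason not in REPORT_REASONS:
            raise ValueError(f"Unknown reason: {self.reason}")

report = ContentReport(
    message_id="msg-123",
    reporter_id="user-456",
    reason="Offensive language",
    explanation="Used a slur in #general",
)
```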


Admins in Admin Console
Once a user flags a message, admins will be notified and will be able to find more details about the incident in the Security Investigation Tool. Based on the information provided, they will be able to decide whether further action is required.

If further action is required, admins will be able to

  • Delete the message

  • Remove the offender from the chat room

  • Close and delete the chat room

Designed workflow for content reporting

Solution video

Content reporting workflow video

Design process

Client Feedback

During our product feedback sessions, we received numerous complaints about the growing number of issues related to inappropriate content in Google Chat.

UX PM Sessions

I held multiple brainstorming sessions with our PM to identify client needs and establish requirements.

  • Report: A method for Chat users to report inappropriate content

  • Alert: A system for admins to receive notifications when an incident occurs

  • Search: Functionality for admins to easily locate and review incident details

  • Triage: Tools for admins to assess situations and take appropriate actions to address them

Notes from UX PM ideation session

Cross-Product Collaboration

This project involved collaboration between two distinct product teams. We frequently held sync meetings with the Google Chat team to define the scope of responsibilities for each team and ensure seamless integration between our solutions.

  • Google Chat: Responsible for the ‘Report’ feature

  • Admin Console: Responsible for ‘Alert,’ ‘Search,’ and ‘Triage’ features

Work division between Google Chat and Admin Console teams

Design Question 1

Which existing tool can we build these new features on top of?

In an ideal scenario, we would always have the capacity to develop a dedicated tool for each specific need. However, real-world constraints such as time and resource limitations often make this impractical. In this instance, it was determined that creating an entirely new tool for these features wasn't feasible. Therefore, we needed to identify a tool within the Admin Console that we could build upon.

Best Fit: Security Investigation Tool


Learn more about Security Investigation Tool

After several discussions among UX, PM, and Engineering, we collectively decided to build our features on top of the Security Investigation Tool (SIT) for two main reasons:

  • SIT was already receiving and displaying data logs from Google Workspace products.

  • SIT already possessed search capabilities with the condition builder, enabling admins to search for specific incidents, such as 'Reported content.'

SIT search results with the condition “Event is content reported.”

Addressing the Limitations

However, despite its advantages, SIT was not originally designed as a dedicated content reporting tool and had limitations: the data log format couldn't accommodate all the information we wished to display, and it lacked an interface for taking action. To create the space we needed, I designed the Message ID and Conversation ID cells as links that open a side panel. Leveraging the additional space provided by the side panel, we were able to include the following:

  • Information about the reported message

  • Transcript of chat surrounding the reported message

  • Details about the chat room

  • Information about the members in the chat room

  • User interface elements for action-taking on each of the above aspects

    • Delete message

    • Remove user from chat room

    • Delete chat room

Side panel interaction was added to provide an interface with more details and actions
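The side panel's contents can be summarized as a simple data model: context about the reported message plus an action for each of the triage options. This is an illustrative sketch only; every name here is an assumption, not the Admin Console's real data structures.

```python
from dataclasses import dataclass

@dataclass
class SidePanelView:
    """Hypothetical model of what the SIT side panel surfaces
    for one reported message."""
    reported_message: str   # the flagged message itself
    transcript: list        # chat surrounding the reported message
    room_name: str          # details about the chat room
    members: list           # members in the chat room
    actions: dict           # triage actions available to the admin

panel = SidePanelView(
    reported_message="msg-123",
    transcript=["msg-121", "msg-122", "msg-123", "msg-124"],
    room_name="room-42",
    members=["user-1", "user-2"],
    actions={
        "delete_message": lambda: "message deleted",
        "remove_user": lambda: "user removed",
        "delete_room": lambda: "room deleted",
    },
)
```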

An Operating (but Incomplete) Workflow

At this point, we had designed an operational workflow, fulfilling the Search and Triage aspects of our requirements. However, admins still lacked a notification system for incidents, requiring them to constantly monitor the Admin Console.

Admin workflow for searching and triaging reported content

Design Question 2

Which existing tools can we leverage?

When designing within an established large-scale product, understanding when and how to leverage existing components is a crucial design skill. In this instance, the Admin Console already featured a tool known as the Alert Center, equipped with a built-in notification system. Recognizing this, I determined that developing a new notification system for content reporting was unnecessary. Instead, I explored ways to integrate our requirements with the existing Alert Center.

Utilizing Activity Rules

To notify admins when content is reported, I decided to utilize Activity Rules. An activity rule is a mechanism within the Alert Center that triggers a predefined action when conditions, set by admins, are met. For our purpose, I configured the condition as 'Event Is Reported content' and set the action to 'Send email notification.' Furthermore, I established this as a system default rule, ensuring automatic implementation for all clients without requiring additional setup.

Activity rule created to notify admins when triggered by reported content
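The condition-action pairing described above can be sketched as a minimal rule engine; the function names and event shape are assumptions for illustration, not the Admin Console's actual API.

```python
# A hypothetical activity rule: a condition predicate paired with an
# action, evaluated against incoming event logs.

def send_email_notification(event):
    # Stand-in for the real "Send email notification" action.
    print(f"Admin alert: {event['type']} in {event['room']}")

activity_rule = {
    # Condition: 'Event Is Reported content'
    "condition": lambda event: event["type"] == "Reported content",
    "action": send_email_notification,
}

def process_event(event, rules):
    """Run every matching rule's action for an incoming event log."""
    fired = []
    for rule in rules:
        if rule["condition"](event):
            rule["action"](event)
            fired.append(rule)
    return fired

event = {"type": "Reported content", "room": "room-42"}
fired = process_event(event, [activity_rule])  # triggers the email action
```

Making this a system default rule, as described above, amounts to shipping it pre-registered in every client's rule list rather than requiring setup.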

Workflow Completed!

With the final addition of the activity rule, we had a single workflow that seamlessly connected the end user's experience to the admin's.

Project status and outcome

After a round of Beta launch, we received positive feedback and proceeded to GA launch!

Content Reporting - Gradual rollout starting on May 25, 2023
Managing Chat Spaces - Gradual rollout starting on May 8, 2023

My role and collaboration

There were two sides to this project. One was the Admin Console side, where admins needed the ability to review reported content and take action accordingly. I owned the end-to-end UX of this side by

  • Collaborating with PM to set requirements

  • Leading ideation sessions and team discussions on UX

  • Designing user flows and interfaces

  • Communicating with cross-functional partners for team alignment

The other side was from Google Chat, enabling the end user to report inappropriate content. This was handled by the Google Chat team.

I mainly collaborated with 2 Product Managers, 2 Writers, and 4-6 Engineers within the Admin Console team. I also held regular collaboration sessions with the Google Chat team to keep expectations aligned.

For more details about the project, please contact jtk8682@gmail.com

© 2025 John Taikhyung Kim
