Do you know what’s being shared across your organization’s internal communications channels? Chances are, the answer is no, especially in today’s remote-first working culture, where it’s impossible to observe your workforce’s communication practices in person. Lax or undefined policies for conduct and communications standards can leave the door open for sharing the wrong things at work. Data loss prevention (DLP) can help solve the content moderation problem with automated scanning of SaaS environments to identify improper messages before they’re shared.
What is content moderation?
Content moderation is a set of policies defining what should and shouldn’t be shared on company systems. Profanity, toxicity, and harassment are commonly covered by content moderation policies, often alongside a Code of Conduct and an Acceptable Use Policy. Per a CNBC report, Google has recently seen a rise in posts flagged for racism or abuse on its internal message boards, and the company is asking employees to take a more active role in moderating those boards. That’s one way to handle content moderation, but it asks a lot of employees, diverting their time and attention away from higher-value work. An automated content moderation system backed by DLP can handle this task far more consistently than a human workforce.
How does Data Loss Prevention apply to content moderation?
Content moderation is a nebulous task for a security team to take on. Without an automated platform that can scan the apps your workforce uses, and without clearly defined rules about which kinds of content should (or should not) appear in company cloud workspaces, it’s hard to get measurable results. DLP lets you customize scans to find the terms and files that don’t belong in your company’s cloud apps and block those messages before they’re shared across the company.
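At its core, this kind of customized scan is a rule that maps a message to an allow-or-block decision. The sketch below is purely illustrative (the blocklist terms and function names are hypothetical, not part of Nightfall’s product or API); a real DLP platform manages these rules centrally and applies far more sophisticated detection.

```python
import re

# Hypothetical blocklist of terms that violate company policy.
# In a real DLP platform these rules are defined and managed centrally.
BLOCKED_TERMS = {"confidential-project-x", "offensive-term"}

def violates_policy(message: str) -> bool:
    """Return True if the message contains any blocked term."""
    words = re.findall(r"[\w-]+", message.lower())
    return any(term in words for term in BLOCKED_TERMS)

def moderate(message: str) -> str:
    """Block messages that violate policy; otherwise let them through."""
    return "BLOCKED" if violates_policy(message) else "ALLOWED"
```

The key design point is that the decision runs before the message reaches the rest of the company, which is what distinguishes automated blocking from after-the-fact cleanup.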
How does Nightfall help with content moderation?
Tracking down improper messages or files shared across your SaaS apps can be difficult, and the added work drains time and resources that should be spent on higher-value projects. Adding Nightfall DLP to your cloud security tech stack lets you set custom rules to root out content that shouldn’t be shared at work. From there, you can delete the messages in question and send notifications that educate users about why this type of conduct isn’t allowed. You can grow your remote work culture with confidence that communications won’t include harmful or otherwise inappropriate content.
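The delete-and-educate workflow described above can be sketched in a few lines. Everything here is a stand-in (the in-memory message store, the notification list, and the function names are all hypothetical), not Nightfall’s actual remediation API; a real deployment would call the SaaS platform’s own APIs to remove the message and notify the author.

```python
# In-memory stand-ins for a SaaS message store and a notification channel.
messages = {"msg-1": {"author": "alice", "text": "inappropriate remark"}}
notifications = []

def remediate(message_id: str, reason: str) -> None:
    """Delete a flagged message and send its author an educational notice."""
    author = messages.pop(message_id)["author"]
    notifications.append(
        (author, f"Your message was removed ({reason}). "
                 "Please review the Code of Conduct.")
    )
```

Pairing deletion with a notification is what turns remediation into education: the user learns the policy at the moment it applies.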
What does Nightfall detect that’s relevant to content moderation?
Nightfall comes with 100+ out-of-the-box detectors to detect and classify data in SaaS systems. You can also set custom keywords to find terms that are inappropriate for work and scan for messages or files that contain them. Safeguard your employees’ and customers’ privacy by scanning for personal information that should be kept off cloud networks, like home addresses, phone numbers, vehicle identification numbers (VINs), and more.
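To make the idea of a detector concrete, here is a minimal sketch of pattern-based detection for two of the data types mentioned above. These regexes are simplified approximations written for illustration only, not Nightfall’s detectors, which combine machine learning with validation logic.

```python
import re

# Illustrative detectors; real DLP detectors are far more robust.
DETECTORS = {
    # Simplified US phone number pattern, e.g. 555-867-5309.
    "us_phone_number": re.compile(r"\b\d{3}[-.]\d{3}[-.]\d{4}\b"),
    # 17-character VIN; valid VINs exclude the letters I, O, and Q.
    "vehicle_id_number": re.compile(r"\b[A-HJ-NPR-Z0-9]{17}\b"),
}

def classify(text: str) -> list:
    """Return the names of all detectors that fire on the text."""
    return [name for name, pattern in DETECTORS.items() if pattern.search(text)]
```

Custom keyword rules work the same way: each keyword list compiles down to another entry in the detector table, and a message is flagged when any detector fires.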
Aaron’s maintains company communication standards on Slack with Nightfall
With thousands of employees spread across multiple time zones, the team at Aaron’s relies on Slack as their primary communication channel. Stuart Lane, Aaron’s Information Security Engineer for Application Security, deployed Nightfall DLP for Slack to monitor the team’s Slack content in real time. Stuart created custom rules that scan Slack messages and files for inappropriate content. Nightfall flags any potentially sensitive information in Aaron’s Slack instance and gives the IT team the option to quarantine it manually or use automated workflows to enforce a security policy. “The ability to customize our notification messages allows us to reinforce our Code of Conduct and Acceptable Use Policy,” Stuart says.
How do I learn more about Nightfall?
The Nightfall blog contains news and information about cloud security, DLP, and Nightfall products to help infosec leaders level up their orgs’ security posture. Find more information about Nightfall and how your org can use DLP in these posts from our blog:
- Read how adding Nightfall DLP for Slack gives your org the power to monitor all Slack messages for inappropriate content: https://nightfall.ai/resources/4-things-newly-remote-companies-need-to-know-about-sharing-data-over-slack/
- Learn more about how DLP can improve your content moderation and address toxic behavior on Slack like bullying or harassment: https://nightfall.ai/resources/bullying-harassment-in-slack/
- Find best practices for maintaining remote workspaces: https://nightfall.ai/resources/3-security-best-practices-for-organizations-managing-a-remote-workforce/