Facebook is the social media platform we all love to hate. We have good reasons for the dislike.
Let's start with data mining. Some 80 percent of consumers express concerns about Facebook's use of personal data. Every click is recorded, and without robust privacy tools, cookies follow you off the platform and into the wider world.
Marketing teams love that data collection. We use psychographic profiles to develop sophisticated ad campaigns that work.
But if you get a sick, slimy feeling from profiting off Facebook, you're not alone. Executives have been slow to moderate hate speech (however much they claim otherwise), and some companies find their ads sitting next to awful content they don't want associated with their brand.
Finally, many consumers are plain disgusted with the way others behave on Facebook. (Even teenagers notice the trend, and they're not known for their kindness as a cohort.) People feel free to say and do things on Facebook they'd never attempt in the real world.
As a small business owner, you can't change Facebook's data collection policies, and you can't make Zuckerberg change his worldview. You can, however, join the Facebook boycott (although it's not clear that withholding ad dollars dents the company's profits): publish your content as usual, and don't spend a dime to promote it. Organic content still works.
But let's take this a step further.
All organizations have a responsibility to create safe, creative community spaces. We protect our brands when we put distance between our work and hate speech. If you're publishing content on Facebook, it's on you to keep the nastiness away.
Start by crafting community standards.
Think of them as rules of engagement for your community. Outline:
Your values. For some companies, a mission statement also contains value statements. Words like transparency, kindness, and honesty all express your conduct expectations.
Your triggers. Violence, racism, threats, profanity, and doxxing should all catch your attention. Other issues, including stolen content, might appear on this list.
Your process. Facebook's built-in tools, including keyword filters, can catch some types of banned content; humans handle the rest. Spell out how the two work together. (A toy sketch of that split follows this list.)
The consequences. Do you skip right to bans, or do you hand out warnings first? Do you hide content or delete it?
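If you're curious what the "filters plus humans" split can look like behind the scenes, here's a minimal, hypothetical sketch in Python. It doesn't use Facebook's actual tools or API; the term lists, the hide/flag/allow decisions, and the sample comments are placeholders you'd swap for your own triggers, consequences, and data.

# Hypothetical sketch only: not Facebook's tooling or API.
# Comment text is assumed to come from whatever export or dashboard you already use.
BANNED_TERMS = {"slur1", "slur2"}   # stand-ins for your "triggers" list
WARN_TERMS = {"scam", "ripoff"}     # terms that merit a human look, not automatic action

def review_comment(comment_text: str) -> str:
    """Return a moderation decision: 'hide', 'flag', or 'allow'."""
    words = {word.strip(".,!?").lower() for word in comment_text.split()}
    if words & BANNED_TERMS:
        return "hide"   # automatic consequence, per your standards
    if words & WARN_TERMS:
        return "flag"   # route to a human moderator
    return "allow"

if __name__ == "__main__":
    for text in ["Great post!", "This company is a scam"]:
        print(review_comment(text), "->", text)

The point isn't the code; it's the division of labor. Automated filters catch the obvious violations, and everything borderline gets flagged for a person who knows your standards.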
It takes time to draft exceptional community standards. If your company doesn't already have a mission statement or clear value positions, the work takes longer.
But when you're done, you've put your community on notice about the behavior you will and won't tolerate. Then, enforcement begins.
Do this well, and your page is transformed. The cesspool of attacks and threats dries up, and thoughtful discussion blossoms in its place. Companies that follow through see an uptick in community engagement.
We're living through a stressful time, and plenty of people use social media to vent aggression and frustration. Right now, those voices dominate the conversation.
Move them away from your page, and you encourage a space filled with kindness, calmness, and meaningful connection. People in your community looking for a bright spot of humanity will come to your page for their fix. And they'll thank you for the opportunity.
Plenty of guides on writing community standards exist. But if the thought of drafting them overwhelms you, don't be discouraged. An outsider (including a freelancer like me) can draft solid guidelines you can tweak for your community. If you need help, reach out!