Social media has been in the political and ethical spotlight over the past few weeks, with former President Trump’s Twitter account suspended, social media apps like Parler pulled from the App Store after being used to organize violence at the Capitol, and more.

Now, Apple is under fire from the non-profit Coalition for a Safer Web, which has filed a lawsuit asking the company to remove the messaging app Telegram from the App Store, citing extremist activity on the app’s public channels.

Though Telegram has banned several extremist groups from its service, the Coalition asserts that only total removal of the app will suffice, particularly as it has become a refuge for white supremacists and other hate groups in the wake of other platforms going dark.

The conversation surrounding these apps opens a broader one about extremist content on social media and the internet in general. Most apps are centralized around core servers controlled by the companies that own them, meaning rules around data privacy, moderation, and more can change at any time.

On this MarketScale Industry Update, hosts Daniel Litwin and Tyler Kern tackle that overarching question. They give more context on the scope of the Telegram accusations and dive into the implications of greater surveillance, the consequences of companies allowing this kind of content to exist, who should have a say over what is and isn’t allowed on social media platforms, how we should analyze the role of private corporations in this dynamic, and more.

Follow us on social media for the latest updates in B2B!

Twitter – @MarketScale
Facebook – facebook.com/marketscale
LinkedIn – linkedin.com/company/marketscale