
'Countering Digital Hate'


Eldorado


“The reason social media companies tolerate hate is for profit.

"We are not Facebook or Twitter customers, we are their product, and we are what they sell to advertisers”.

"He has one simple message for small or medium businesses which advertise on social media. “Suspend your adverts for a week, or a month, and tell them why you are doing it”.

"That, he says, is the only language the social media companies understand."

Full article at the Times of Israel: Link

"The organisation lobbies American "big tech" firms such as YouTube, Facebook, Amazon, Twitter, Instagram and Apple to "deplatform" individuals so that they cannot present their views to the general public. The CCDH's most high-profile campaign so far has been against prominent British conspiracy theorist David Icke. Other targets of CCDH campaigns have included the left-wing politician and broadcaster George Galloway and the right-wing media personality Katie Hopkins."

Wiki


 

My instinct is to limit censorship to the bare minimum. No, you can't scream 'Fire!' in a crowded theatre, to use the classic example.

My problem with 'digital hate' is, firstly, that it can only be subjectively judged as hate, and the politics of whoever does the judging skews things. Secondly, isn't it a right in a free society to spew hatred that others are not forced to listen to?

Censorship in a free society should be kept to a minimum.


I'd hardly call it muzzling. 

I would say that no one owes anyone else a soap box to stand on. 

Take this site, for instance. 

It doesn't owe anyone a platform from which to spew hate. 

People on here have to follow the rules. 


But if certain platforms are the major or only way to engage politically (e.g., Trump on Twitter), then banning someone from that platform is removing them from the political process. I'd say don't ban them at all, but prosecute people if they break the law. If saying mean things is not against the law and you want to avoid the content, there are mute options.

 


The problem is, hate speech is powered by mob rule and few ever take responsibility for their words. A message of hate spread by one person with a large following will do more damage than the same allegation made face to face. Online hate speech spreads like a disease. It must be awful to be a kid growing up these days. Their phone is probably their primary means of safety. Being picked on by a few classmates face to face is one thing, but it is quite another to have that symbol of safety and privacy invaded and targeted daily by someone who makes up a false claim about them. The claim then spreads across the school, the community, the planet, and all of the abuse lands right in that poor person's phone, so that every time they hear it chime with a new message they dread it because of the constant intimidation they receive. If the platforms that allow hate speech to be published do not take part of the responsibility, then they are in essence part of the problem.

 

Edited by TigerBright19

37 minutes ago, TigerBright19 said:

The problem is, hate speech is powered by mob rule and few ever take responsibility for their words. A message of hate spread by one person with a large following will do more damage than the same allegation made face to face. Online hate speech spreads like a disease. It must be awful to be a kid growing up these days. Their phone is probably their primary means of safety. Being picked on by a few classmates face to face is one thing, but it is quite another to have that symbol of safety and privacy invaded and targeted daily by someone who makes up a false claim about them. The claim then spreads across the school, the community, the planet, and all of the abuse lands right in that poor person's phone, so that every time they hear it chime with a new message they dread it because of the constant intimidation they receive. If the platforms that allow hate speech to be published do not take part of the responsibility, then they are in essence part of the problem.

 

A couple of questions:

1) What is the definition of "hate speech" in the above? I don't know of any accepted definition, and it seems to differ from person to person.

2) Is being offended by a comment enough to censor the person commenting? I agree with offensive comedian Ricky Gervais on this one: "Just because you are offended doesn't mean you are right."

I know there are people in this world who are offended that women have the right to vote. Should their offence be considered, or is it only when you offend the Karens of the world that action should be taken?

 

In my opinion, it is a disservice to "protect" people from the evils of the world. There won't always be a big brother to look after you and you must learn to put on your big boy pants and accept the reality of the world.


5 hours ago, Hugh Mungus said:

In my opinion, it is a disservice to "protect" people from the evils of the world

I agree with everything you said except this^

When it comes to instances of these platforms being used to bully children, I really think we need serious repercussions.  Children commit suicide every year because of this and those who take part in it should be made to face the ugliness of what they do.


22 hours ago, Hugh Mungus said:

A couple of questions:

1) What is the definition of "hate speech" in the above? I don't know of any accepted definition, and it seems to differ from person to person.

2) Is being offended by a comment enough to censor the person commenting? I agree with offensive comedian Ricky Gervais on this one: "Just because you are offended doesn't mean you are right."

I know there are people in this world who are offended that women have the right to vote. Should their offence be considered, or is it only when you offend the Karens of the world that action should be taken?

 

In my opinion, it is a disservice to "protect" people from the evils of the world. There won't always be a big brother to look after you and you must learn to put on your big boy pants and accept the reality of the world.

The form of online 'hate speech' that I think warrants deletion, retraction, or apology is when someone makes up a defamatory comment or allegation against someone out of pure malice, e.g. calling an innocent person a shoplifter or a sex offender, or making an online video filled with outlandish lies about someone and letting their followers believe it and attack the innocent person online. When 1,000 people are deceived into believing the false allegation and only a few of them hear the innocent party's side, the lies are heard louder than the truth. I would say that warrants the moderators of the platform punishing the perpetrator and defending the innocent. The problem is, how does the moderator know whether a party is innocent or guilty without verifiable evidence? The only way I can think of for them not to take sides is to make it known that they do not endorse anything said on their platform, but that will not comfort the victims of suicide who could do nothing to stop the online abuse because the platform refused to take any responsibility. I suspect some relatives will start to demand financial compensation for mental health and funeral costs, etc. I know that if social media were state-run, there would be high demand for compensation from the government, and perhaps for legal fees to clear people's names of online defamation.

I think the best alternative would simply be to give all comments on social media a limited lifespan. Verbal insults last only a short time and are forgotten, but insults posted in online text are permanent and the negative mental effects do not go away, because the insults are set online like cement and always sound fresh to new readers. The only way the target can try to put it out of their mind is to avoid social media altogether, which is almost impossible these days. Much better to wipe the slate clean each month. Sometimes I wish the internet was like a Magna Doodle.  ^_^
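As a rough illustration of that limited-lifespan idea, here is a minimal sketch in Python. Every name in it is made up for the example, and no real platform works this way; it just shows comments carrying a timestamp and anything older than an assumed one-month window no longer being shown.

from dataclasses import dataclass, field
from datetime import datetime, timedelta
from typing import List, Optional

# Assumed one-month lifespan for every comment (an assumption for this sketch).
RETENTION = timedelta(days=30)

@dataclass
class Comment:
    author: str
    text: str
    posted_at: datetime = field(default_factory=datetime.utcnow)

    def is_expired(self, now: datetime) -> bool:
        # A comment "ages out" once it is older than the retention window.
        return now - self.posted_at > RETENTION

class CommentStore:
    def __init__(self) -> None:
        self._comments: List[Comment] = []

    def post(self, author: str, text: str) -> None:
        self._comments.append(Comment(author, text))

    def visible(self, now: Optional[datetime] = None) -> List[Comment]:
        # Expired comments are silently dropped, so the slate wipes itself
        # clean as posts age out - the "Magna Doodle" effect described above.
        now = now or datetime.utcnow()
        self._comments = [c for c in self._comments if not c.is_expired(now)]
        return list(self._comments)

Whether a real site would actually delete the text or merely hide it, and what that would mean for evidence in defamation cases, are separate questions the sketch leaves open.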

 

 

Edited by TigerBright19

On 8/5/2020 at 5:13 AM, papageorge1 said:

My instinct is to limit censorship to the bare minimum. No, you can't scream 'Fire!' in a crowded theatre, to use the classic example.

My problem with 'digital hate' is, firstly, that it can only be subjectively judged as hate, and the politics of whoever does the judging skews things. Secondly, isn't it a right in a free society to spew hatred that others are not forced to listen to?

Censorship in a free society should be kept to a minimum.

I agree that, ideally, censorship in a free society should be kept to a minimum (with shouting 'Fire!' in a crowded theatre an example of the obvious need for some limit).

My main disagreement with you is the "not forced to listen to" bit.

Sure, if the hatred is spewed on TV or online then you can turn it off. But there are a number of circumstances where you can't switch it off - when you're being subjected personally to abuse, such as when you're travelling on a bus or train, or operating a business, or visiting an abortion clinic, or, as AndThen mentioned, when children are subjected to this sort of abuse. These are cases where you're trapped by your circumstances, and the abuser knows it and takes advantage of it.

It's made worse if the subject of the abuse is in a weak position compared with the perpetrator - such as belonging to a cultural or religious minority, having a disability, or being a child.


14 minutes ago, Peter B said:

I agree that, ideally, censorship in a free society should be kept to a minimum (with shouting 'Fire!' in a crowded theatre an example of the obvious need for some limit).

My main disagreement with you is the "not forced to listen to" bit.

Sure, if the hatred is spewed on TV or online then you can turn it off. But there are a number of circumstances where you can't switch it off - when you're being subjected personally to abuse, such as when you're travelling on a bus or train, or operating a business, or visiting an abortion clinic, or, as AndThen mentioned, when children are subjected to this sort of abuse. These are cases where you're trapped by your circumstances, and the abuser knows it and takes advantage of it.

It's made worse if the subject of the abuse is in a weak position compared with the perpetrator - such as belonging to a cultural or religious minority, having a disability, or being a child.

I don’t like rudeness, but I dislike censorship more.


 
6 hours ago, Peter B said:

Not wanting to trap or trick you, but do you think the incidents in these stories should be protected from legal punishment?

Chanting sexist lyrics

 

The school has a right to suspend, but there should be no legal action by police beyond disturbing the peace or whatever. But the OP was about 'digital hate'. I would prefer to see sexist lyrics and racially provocative words allowed on social media rather than censored, as they can be easily dismissed.


If you edit what people can say, that makes you a publisher. Publishers are legally responsible for whatever they publish, not just hate speech but any information or images. For example, if someone were to arrange a drug deal on Twitter or Instagram, and those companies were legally considered publishers, they would be responsible. The same goes for all the "nudes" kids send to each other. (I teach at a high school and there are always kids getting in trouble for sending and sharing what is legally considered *** Blocked ***.)

So Facebook, Twitter, etc. don't want to be legally seen as publishers. To be a platform in the legal sense, they can't be seen as choosing what gets put out except for legal or safety reasons. Obviously, they do this anyway, but the desire to remain a platform rather than a publisher is the main factor preserving any semblance of free speech on these social networks. That's what all the Congressional hearings boil down to: do you censor political speech? If so, you're a publisher and have probably already broken the law numerous times based on the content published on your site. If you want to avoid that liability, allow all sides to express themselves.


8 hours ago, C L Palmer said:

If you edit what people can say, that makes you a publisher. Publishers are legally responsible for whatever they publish, not just hate speech but any information or images.

EXACTLY!  The Communications Decency Act shields them when they are just "hosting" content. If they get into the business of picking and choosing, and it predominantly impacts ONE ideology over another, then they need to lose that protection. Section 230 of that act needs to be revisited if these platforms don't stop censoring for obviously political aims. As it is now, if I, for example, made a threat against some politician, then I'd be subject to a visit from the fearless minions of the EFFA BEE EYE. The site that hosted those words can't be sued - at least that's MY understanding. If I'm wrong I'll stand to be corrected.


We are always free to say whatever we wish... though never free from the consequences of it.

