Facebook Will Pay Content Moderators Working from Home

Facebook's third-party content moderators in the US will be paid while working from home.


Facebook chief Mark Zuckerberg confirms on a call with reporters that the company will pay its third-party US content moderators while they work from home. Even if those moderators cannot complete their regular tasks, they will receive their full salaries.

The coronavirus outbreak has compelled most workers at the tech giants to work from home. Companies like Google and Twitter are urging employees to do the same, and Facebook is no exception. The company plans to rely on artificial intelligence on a broad scale to moderate content for as long as the coronavirus crisis continues.

Zuckerberg confirms that the work-from-home policy will continue until sufficient public health responses are available. 

But according to a workers' rights group, the tech giant's initiatives do not go far enough.

Joe Rivano Barros, a manager at the Workers Agency, says, "It's great that they are letting them work from home, but it seems like the bare minimum Facebook could do."

He also points out that contract workers do not receive the bonus payments that directly employed staff get.

Nearly 15,000 third-party content moderators work for Facebook in the US.

Their job is to judge whether posts flagged by the firm's own software or by users are harmful or inappropriate.

Zuckerberg confirms that, for privacy reasons, some data cannot be shared with contractors when they work from home rather than from the office, though he does not go into detail.

As a result, some harmful content will slip through, but Zuckerberg says the company will still remove it.

Facebook also confirms that sensitive content involving suicide or self-harm will be moderated by full-time staff, as contract moderators working from home would lack the mental-health support needed to handle such posts.

With many people confined at home in self-isolation, depression may increase, and more of this type of content may be created.

He says, "I am personally quite worried that the isolation from people being at home could potentially lead to more depression or mental health issues, and I want to make sure that we are ahead of that supporting our community."

Facebook has spent years building tools to detect such content and remove it automatically.

But earlier this week, the company acknowledged the limitations of this approach when a considerable number of legitimate posts were wrongly flagged and removed from view, apparently because of a spam-filter bug.

Rivano Barros warns that using artificial intelligence instead of contract workers will deepen the company's dependency on automated systems, which could ultimately replace human moderators altogether.

Facebook has also announced a $100m (£86m) grant programme to help small businesses cover losses caused by the outbreak.

In addition, the company's full-time employees are each receiving a $1,000 bonus during the crisis.