Toxicity Checker on OnlineCourseHost.com

The toxicity checker monitors the courses published and sold on the platform to prevent the spread of unsafe or illegal content.

To protect the platform and all the creators who use it, our toxicity checker systematically validates all content.

Whenever a course creator tries to publish a course on their website, the toxicity checker scans the course content to ensure it is safe for publication.

After publication, the content is also re-checked with each lesson update to ensure it remains safe.

Here are some examples of toxic content. Note that this is not a complete list; these are just typical examples:

  • promotion of illegal activities
  • stolen content
  • pseudo-science or misinformation
  • "make money fast" content
  • hacking (white-hat or not)
  • promotion of self-harm
  • black-hat SEO practices
  • pornography

The toxicity checker will classify the courses into two categories:

Non-Toxic

This status indicates that the course is safe and can be made public once the course creator has completed the required checklists for publishing a course.

Toxic

This status indicates that the course is not safe for publication. Consequently, the course will not be published, and the Helpdesk team will manually review it and follow up with feedback.

What happens if my content is considered toxic, even after review?

If the support team reviews your course content and agrees with the checker that it is toxic, the course will not be published on the platform.

If the content is still considered toxic after several reviews, your account could be canceled and your website deleted, although it rarely ever comes to that.

The toxicity checker is there to protect not only the platform but also you, as you might not even realize that a certain type of content could be problematic.

Where can I check my toxicity status?

To review your toxicity status, you can navigate to Admin => Settings => General => Toxicity Status => View Strikes.

There, you can see the number of strikes incurred within the past 6 months, including both previous and current infractions, as well as the course(s) associated with each strike.

What if my course is flagged by the toxicity checker?

Once a course is determined to be toxic, the following actions are taken:

  1. If a published course is found to contain toxic content, it will be unpublished and a strike will be issued.
  2. If the course is still in draft, it will be reviewed before publication. If toxic content is found, it will not be published until the toxic content has been removed. In this case, a strike will not be issued.
  3. A strike will be issued to the course creator if the course is published and later edited to include toxic content.
  4. The course creator will be notified via email if their course(s) have been unpublished due to the presence of toxic content.

Please note that our policy permits a maximum of 3 strikes within a 6-month period. Upon the third strike, an automated 30-day removal process is initiated, after which the entire account is permanently deleted from the platform.
