Artificial intelligence trained on the banned website 4chan turned into a “hate machine”


AI researcher and YouTuber Yannic Kilcher trained an AI language model on data from a notorious website


Kilcher used 3.3 million posts from 4chan, an imageboard website known for anime and meme content that is currently blocked in many countries.

The posts Kilcher used came from /pol/ (“Politically Incorrect”), the board that hosts some of the most offensive content on the platform.

In the end, the model turned into a “hate speech machine”, generating highly offensive content.
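For context on what “training a language model on a corpus” involves, the sketch below shows, in broad strokes, how a pre-trained causal language model can be fine-tuned on a plain-text dataset using the Hugging Face libraries. This is an illustration only, not Kilcher’s actual pipeline; the base model (“gpt2”), the data file (“posts.txt”) and the hyperparameters are all assumptions made for the example.

```python
# Illustrative sketch only: fine-tuning a small causal language model on a
# plain-text corpus with Hugging Face libraries. Model name, file path and
# hyperparameters are assumptions, not Kilcher's actual setup.
from datasets import load_dataset
from transformers import (AutoTokenizer, AutoModelForCausalLM,
                          DataCollatorForLanguageModeling, Trainer,
                          TrainingArguments)

model_name = "gpt2"                        # assumed small base model
tokenizer = AutoTokenizer.from_pretrained(model_name)
tokenizer.pad_token = tokenizer.eos_token  # GPT-2 has no pad token by default
model = AutoModelForCausalLM.from_pretrained(model_name)

# "posts.txt" is a hypothetical file with one scraped post per line.
dataset = load_dataset("text", data_files={"train": "posts.txt"})

def tokenize(batch):
    # Convert raw post text into token IDs, truncating long posts.
    return tokenizer(batch["text"], truncation=True, max_length=512)

tokenized = dataset["train"].map(tokenize, batched=True, remove_columns=["text"])

# mlm=False configures the collator for causal (next-token) language modeling.
collator = DataCollatorForLanguageModeling(tokenizer=tokenizer, mlm=False)

args = TrainingArguments(
    output_dir="finetuned-lm",
    num_train_epochs=1,
    per_device_train_batch_size=4,
)

Trainer(model=model, args=args, train_dataset=tokenized,
        data_collator=collator).train()
```

After training, the model simply continues whatever text it is prompted with, so its output mirrors the tone of the corpus it was fine-tuned on.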

“The language model was horrible,” Kilcher said:

“It perfectly captured the aggression, nihilism, trolling and distrust that permeated most posts on /pol/.”

The YouTuber then deployed nine bot accounts powered by the model; the bots posted about 15,000 times in 24 hours.

Meanwhile, AI researchers criticized Kilcher’s project as an “unethical experiment”.

“His experiment would never pass a human research ethics board,” said Lauren Oakden-Rayner, a senior research fellow at the Australian Institute for Machine Learning.

“Medical research has a strong ethical culture because we have a terrible history of harming people. This experiment violates every principle of human research ethics.”

Kilcher, for his part, said that he did not intend the project as an experiment and that users already share highly offensive content on the platform in question.

Kilcher, who shared his observations in a video on his YouTube channel presented as “the worst artificial intelligence you will ever see”, named the model “GPT-4chan”.

With this name, Kilcher was referencing GPT-3, the well-known language model from the artificial intelligence firm OpenAI.

GPT-3 became famous for its ability to design website layouts, answer questions, and write articles and recipes.

Source: VICE


