Artificial intelligence girlfriends “collecting creepy personal data”

Romantic AI chatbots are violating users' privacy in "disturbing new ways," researchers suggest

AI girlfriends like Replika's have become increasingly popular with the rise of generative AI chatbots that speak in human voices (Replika)

Popular artificial intelligence girlfriends and boyfriends collect “creepy” information about their users and fail to meet basic privacy standards, according to a new study.

None of the 11 AI chatbots reviewed by researchers at the Mozilla Foundation (including Replika and Eva AI) met the organization’s security requirements. This put them “on par with the worst product categories” the organization has ever reviewed for privacy.

AI chatbots that offer users a romantic relationship have seen tremendous growth in the last year, with more than 3 billion search results for “AI girlfriend” on Google. The popularity of these bots follows the introduction of advanced generative AI models like ChatGPT, which can provide human-like responses.

Mozilla noted several “red flags” when it comes to popular chatbots, such as failing to encrypt personal information to minimum security standards.

“To be perfectly clear, AI girlfriends are not your friends,” said Misha Rykov, a researcher at Mozilla’s Privacy Not Included project.

“Although they are marketed as something that will improve your mental health and well-being, they specialize in delivering addiction, loneliness and toxicity while taking as much data from you as possible,” he added.

The research was detailed today in a blog post published to coincide with Valentine’s Day, warning that romantic AI chatbots are invading users’ privacy in “disturbing new ways”.

The report on Eva AI Chat Bot & Soulmate, which costs about $17 per month, noted that the chatbot has a good privacy policy but is still aggressive in soliciting personal information.

“The Eva AI chatbot feels pretty creepy in that it actually forces users to share tons of personal information, even though its privacy policies seem to be some of the best we’ve reviewed,” the Mozilla Foundation said in a blog post on its website.

The researchers also cautioned that even though the app’s privacy policy says it does not share or sell this information, there is no guarantee that the policy will not change in the future.

The Independent has reached out to Eva AI and Replika for comment but has not yet received a response.

The researchers advised users of AI chatbots not to share sensitive information with these apps and to request deletion of their data once they stop using them.

Users are also advised not to grant continuous geolocation tracking to AI chatbot apps and not to give them access to the device’s photos, videos or cameras.

source: https://www.independent.co.uk/tech

