
How Scary Is NUCA? This AI Camera Can Create Deepfake Images of People by Removing Their Clothing

With deepfake photos already alarming people, the usefulness of AI is overshadowed by its dangers, as in the case of NUCA, a camera that can strip the clothing from a subject's photo in seconds.

As artificial intelligence (AI) evolves, so does our understanding of its capabilities and of how it might be applied in our daily lives. Some AI products, however, have the potential to cause more harm than good.

For instance, the AI camera dubbed “NUCA,” made by two European artists, can digitally remove a person's clothing in near real time. Although it was built as an art project rather than a covert editing tool, it exposes AI's potential darker side.

What Does NUCA Do?

This AI Camera Can Create Deepfake Images of People by Removing Their Clothing: How Scary Is NUCA? (Photo: Benedikt Groß)
The usefulness of AI is overshadowed by its risks, such as the NUCA camera's ability to quickly strip the clothing from a subject's photo, a prospect that alarms people already wary of deepfakes.

NUCA was developed by artists Mathias Vef and Benedikt Groß to highlight how easily AI systems can be abused to violate personal privacy. The device, whose body was created with 3D design tools, captures pictures through its lens and then processes them with AI in the cloud.

According to Fox News, the AI reconstructs what it predicts a person's nude body might look like based on data about their identity, such as age and body type. The ease and speed with which this can be done is what is most concerning: NUCA requires little technical expertise and completes its operation in about 10 seconds.

The Rise of Deepfake Technologies

While AI-generated images, mainly fake nude photos of celebrities, have circulated on adult websites for quite some time, NUCA is a significant milestone in terms of convenience and speed.

Traditional methods for making fake nudes call for skilled editing and a lot of time. In contrast, NUCA democratizes this ability, reducing the time and skill barrier to just seconds and a simple operation, thus amplifying the potential for misuse.

NUCA's Potential for Misuse and Ethical Problems

The potential for harm is the main concern with technologies like NUCA. Because it can digitally “undress” someone without their consent, the technology easily lends itself to malicious uses such as blackmail or cyberbullying. By exposing this capability, the artists hope to spark public debate about the ethics of AI development. However, while they do not intend for NUCA to be used commercially, the technology it represents could be replicated and used unethically by others.

AI’s Impact on Society

Such technologies have far wider implications. As AI's capabilities expand, so does its ability to challenge social norms and threaten personal security.

Deepfakes are becoming increasingly realistic, blurring the line between truth and digital fabrication. This development risks complicating social and legal frameworks, making it difficult to distinguish real content from fake.

Things have even reached the point where pedophiles have used AI to create fake nude images of children for extortion.

Brighter Side of AI Cameras

Despite the commotion over AI cameras producing fake images, such cameras still serve the purposes they were built for in the first place.

For instance, a Canadian telecom company uses AI cameras to monitor and help prevent wildfires brought on by climate change.

Even the French police are currently looking into using AI cameras to identify risky activities that might disrupt the 2024 Paris Olympics.
