
AI Bots Going After High End and Unusual Images

Warren Faidley

Had both my homepage and X social media account overwhelmed by bots over the last couple of weeks, right after I started posting comments and images on monsoon-related topics. Turns out, AI bots from China and Russia are often triggered to infest an account once they notice unusual images. This is done for AI image learning.

Just another way our work is being hijacked.

At some point, it's just not going to be worth the time and effort to photograph anything, knowing it will be molested and copied by AI.
 
There is an indication that AI companies, at least in Berne Convention countries, will lose all of the lawsuits over training their systems on copyrighted images. The training requires copying an image and storing that copy on the company's servers, an act that is a slam-dunk fail on three of the four factors of fair use (it's commercial, it uses the entire work, and the resulting output is a replacement for the original in the market).

The other issue is that AI output can have no copyright protection, meaning it is worthless to anyone wanting to use it for licensing, trademark or any other IP considerations. The only revenue AI companies can generate is subscription fees from its users.

Of course, copyright means nothing in China and Russia, which are already ripping off IP on a global scale.

The main issue for chasers is that when they post prized images and videos to any of the social media platforms, the platforms' new terms of service grant them permission to use that content to train their own AI systems. Some of the platforms also have agreements to share data with outside AI companies. That means you have no recourse if their AI ends up replacing your own work.

A countermeasure is probably to avoid posting large, high-resolution copies of images; smaller copies are likely less useful for training.
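A minimal sketch of that idea, assuming Pillow is installed and that the 1200-pixel cap, quality setting, and file names are just placeholders you would adjust for your own workflow:

# Hypothetical pre-posting step: downscale an image and re-save it so only a
# small copy ever goes online. Assumes Pillow (pip install pillow); the 1200 px
# cap and the file names below are arbitrary placeholders, not a recommendation.
from PIL import Image

def make_web_copy(src_path: str, dst_path: str, max_px: int = 1200) -> None:
    img = Image.open(src_path)
    img.thumbnail((max_px, max_px))   # shrink in place, preserving aspect ratio
    img = img.convert("RGB")          # drop any alpha channel so JPEG save works
    # Re-saving as a fresh JPEG (without passing exif=) also leaves the
    # original camera metadata behind.
    img.save(dst_path, "JPEG", quality=80)

make_web_copy("monsoon_full_res.tif", "monsoon_web.jpg")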
 
It's a sad state of affairs for human creators.

Copyright protection is probably not even relevant when the cost to challenge and prove a single claim is in many cases massive. For AI used as part of a larger work (a video, or a still image with only bits of AI), copyright protection is allowed because those works may have a human creator. A lot of AI is being used partially like that, so the damage is still done.

On the flip side of looking to copyright to protect us, false copyright claims are used as outright warfare, or even to game systems and take money away from creators that is owed to them. For example, false copyright strikes are filed all the time on many huge platforms to take down videos and audio and hurt creators, whether for subjective political correctness, by competitors, or by large media groups playing licensing games with royalty-free audio.

Unfortunately, AI has nearly infinite revenue streams, not just the people who pay a subscription for an LLM. It indirectly gains incredible revenue by dominating social media, mass media, and increasingly music and art. The revenue comes from ad dollars attached to the crappy flood of fake content; it does not always flow back to the AI companies directly, but to the AI 'creators' using it to drown us all in their trash. The general public loves AI content and so often cannot tell the difference between AI and reality that it is frightening. It isn't even that good yet, but it will be.

Photography, in my opinion, was already ruined by extreme edits. It is common for many of the most popular landscape photographers to take images that are super boring, blend in exciting skies, and then crank up every edit slider to 11. They call it photography, but it is digital art. Digital art already killed traditional photography through sheer signal-to-noise, because the masses of simple audiences want that extreme wow factor in images even if they look super fake. Having watched that unfold over ten years, I think AI has a good chance of killing all art.

Unfortunately there are legions of people who are obsessed with it because they regard whatever they can 'make' by telling a computer to make it as an achievement and something they should be noticed for.

I hope you are right about the lawsuits, but I am afraid nothing will stop AI. I fully expect it to be worse for society than social media. I am not entirely one-sided either; it has some legitimate uses, but like everything else developed by soulless Silicon Valley bros, it is not treated with the maturity and careful deployment it badly needs.

The best idea I have for people who used to make good money on stills and video, and who now may not be able to, is to adapt and produce on many platforms at once, or to find other means of profit and enjoyment. For myself, I have already mostly given up on selling prints or timelapses from my landscape and night photography, because for now I refuse to play social media algorithm games to get any of it noticed.
 