In the era of web crawling, websites used a 'robots.txt' file to tell crawlers which pages not to index. Some crawlers ignored it anyway.
Now we are in the era of AI, where crawlers index the whole web and train models on it. We need something similar to robots.txt for AI training.
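In fact, some AI crawlers already announce user-agent strings that a robots.txt can target. As a rough sketch (assuming the crawlers actually honor the directives, and using names like GPTBot and CCBot only as examples), a site could publish something like:

    User-agent: GPTBot
    Disallow: /

    User-agent: CCBot
    Disallow: /

The catch is the same as before: compliance is voluntary, and some crawlers will do it anyway.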
Comment if you agree. Freedom matters.