[please document what impact AI systems are having on the Web, how they're systemic (e.g. may change a significant set of expectations on the Web as a whole), with examples that illustrate this]
AI has led to software such as Glaze, which subtly alters images so that machine-learning systems cannot read or learn from them properly. This is likely to be just the start, as creators push back against scraping and machine learning that takes their work and returns a product with no benefit for the original artists, writers, and creators. Nightshade is another tool in this genre.
On blocking, considerable numbers of news organisations block AI bots through robots.txt. But this is unlikely to be sufficient, as new bots surface constantly. Robots.txt has, moreover, never been sufficient for blocking many bots, and if it becomes the routine defence against AI crawlers, more useful content will be forced behind paywalls and other mechanisms for keeping bots out.
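As a concrete illustration, a typical opt-out looks like the following. GPTBot, CCBot, and Google-Extended are the crawler tokens published by OpenAI, Common Crawl, and Google respectively; whether any given bot honours them is entirely up to the bot operator:

```
# Block known AI training crawlers (tokens as published by their operators)
User-agent: GPTBot
Disallow: /

User-agent: CCBot
Disallow: /

# Google's opt-out token for AI training (does not affect Search indexing)
User-agent: Google-Extended
Disallow: /
```

Compliance with robots.txt is voluntary, which is the core weakness: a new or non-compliant crawler simply ignores the file, and site owners must keep adding tokens as new bots appear.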
There is a clear arms race occurring, as Turing tests have evolved in complexity to the extent that many people find them too difficult. For example, the PlayStation Network uses a series of games to establish that the user is indeed human. That is negative for disability access. Generative AI will increase the incentive for website owners to use such tests, making the web less usable.
More speculatively, web developers may be incentivised to mislabel content, including labelling AI-generated content as human-generated and vice versa, or applying the wrong labels to images. This could happen as a result of the way search engines treat the content categories, or as a protest against AI, and it would have a negative impact on accessibility.