Deputy prime minister to urge UN general assembly to create international regulatory system
Even experts can’t keep up with all the developments :/
To add to that, in a lot of countries regulators can’t even keep up with basic technology
Turkey famously required a 3-day review before you could post a reply on an internet forum back in the 90s. They thought forums were like dueling letters to the editor.
I’d have liked to see how that would have turned out :D
great voids of emptiness
But are they all dinosaur paper-based lawyers? Or so focused on specific issues that they miss the bigger picture? Or are they actually advised by expert panels but the meetings take too long…or nobody cares…or everyone is compromised by other interests…what?
Yeah what a bullshit title.
Regulators hardly know what the internet is yet.
“We’re incompetent at our jobs”.
Step down if you can’t keep up.
Honestly, an easy way to regulate generative A.I. is to just pretend the output was made by a person. If your “A.I.” is used to create a deepfake political ad, you should be fined or sued as if you had an intern make it. If you aren’t sure the LLM won’t hallucinate falsehoods, don’t use it for news articles unless you’re ok with libel laws being applied.
It’s kinda exactly the same as someone on the street handing you a bit of paper with a rumour on it, and you publishing it without checking whether it’s correct.
Computers, the Internet, and the whole of IT have been moving too fast for regulators to keep up since the 90s. They are slower than a tortoise walking through molasses with a blindfold on.
But what can you expect when those who make regulations over IT still don’t know how to change the time on their VCR?
s/AI/Technology/
Regulators still don’t understand basic things about the internet, why are we surprised?
Have you tried being less old and understanding tech better?
That’s really not the differentiating factor.
Easily 80% of young people loudly commenting on the topic online have no idea about nuances as centrally relevant to where the tech is going as “Do Large Language Models learn world models or just surface statistics?”
I see a ton of young people patting themselves on the back while regurgitating what is, at this point, clear misinformation about stochastic parrots, remixing content they picked up from similarly poorly informed tech writers with skin in the game, oblivious to the emerging picture in ongoing research.
It is moving too quickly.
Just today I was reading a paper on using CoT (chain-of-thought) prompting (research from 2022) to efficiently transfer domain knowledge from a larger model to a much smaller one, which then outperforms the original on that domain.
What that’s going to mean for Meta’s open-sourced models, for the market for synthetic data, for the practical limitations on the impact of IP cases - it’s wild.
And that’s just this week’s news.
It’s way too much too quickly.
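For anyone curious, the distillation idea mentioned above can be sketched roughly like this. This is a toy illustration, not the paper’s actual pipeline: every function name here is hypothetical, and a real setup would call a large-model API for the teacher and run a fine-tuning job for the student.

```python
# Rough sketch of CoT-based distillation: a large "teacher" model produces
# chain-of-thought rationales, and a small "student" model is fine-tuned on
# (question, rationale + answer) pairs. All names are hypothetical.

def teacher_generate(question: str) -> str:
    """Stand-in for a large-model API call with a CoT prompt
    (e.g. appending "Let's think step by step")."""
    # Toy hard-coded rationale, for illustration only.
    return ("Reasoning: 3 bags with 4 apples each is 3 * 4 = 12 apples. "
            "Answer: 12")

def build_distillation_set(questions):
    """Collect (question, rationale + answer) pairs as fine-tuning data
    for the smaller student model."""
    return [{"prompt": q, "completion": teacher_generate(q)}
            for q in questions]

dataset = build_distillation_set(
    ["If I have 3 bags with 4 apples each, how many apples do I have?"]
)
# The student would then be fine-tuned on `dataset`; with enough such
# rationale-annotated examples, the reported result is a small model
# matching or beating the teacher on the narrow domain.
print(dataset[0]["completion"])
```

The key design point is that the teacher emits the *reasoning*, not just the answer, so the student learns the intermediate steps rather than memorizing labels.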
Keep in mind that the practicing doctor is, on average, about 17 years out of touch with the most recent research.
To expect a politician of any age to have a solid grasp on this stuff isn’t practical.
There are a number of trends in the research that can be reasonably predicted, but I’ve never seen a field moving this fast.
The very idea of trying to predict the situation even five years out is ludicrous. By the time legislation proposed today is being passed, it’s going to be obsolete.
Regulators are screwed.