I don’t understand why browsers support this “functionality”.
But who can stop an oven with a gun?
Why do people do that? I mean, if they intend to abandon the dog poop, why would they bag it first?
“always chooses the cheapest option”
“quality isn’t great”
Capitalism is to blame!
My naming convention for C++ is that custom types are capitalized and instances aren’t. So I might write “User user;”.
So far “more data” has been the solution to most problems, but I don’t think we’re close to the limit of how much useful information can be learned from the data, even if we’re close to the limit of how much data is available. Look at the AIs that can’t draw hands. There are already many pictures of hands from every angle in their training data. Maybe just having ten times as many pictures of hands would solve the problem, but I’m confident that if that were not possible, then doing more with the existing pictures would also work.* Algorithm design just needs some time to catch up.
*I know that the data that is running out is text data. This is just an analogy.
What occasions are you referring to? I know people claim that Israeli use of white phosphorus munitions is illegal, but the law is actually quite specific about what an incendiary weapon is. Incendiary effects caused by weapons that were not designed with the specific purpose of causing incendiary effects are not prohibited. (As far as I can tell, even the deliberate use of such weapons in order to cause incendiary effects is allowed.) This is extremely permissive, because no reasonable country would actually agree not to use a weapon that it considered effective. Something like the firebombing of Dresden is banned, but little else.
Incendiary weapons do not include:
(i) Munitions which may have incidental incendiary effects, such as illuminants, tracers, smoke or signalling systems;
(ii) Munitions designed to combine penetration, blast or fragmentation effects with an additional incendiary effect, such as armour-piercing projectiles, fragmentation shells, explosive bombs and similar combined-effects munitions in which the incendiary effect is not specifically designed to cause burn injury to persons, but to be used against military objectives, such as armoured vehicles, aircraft and installations or facilities.
The issue I have with referring to the current situation as a bubble is that this isn’t just hype. The technology really is amazing, and far better than what people had been expecting. I do think that most current attempts to commercialize it are premature, but there’s such a big first-mover advantage that it makes sense to keep losing money on attempts that are too early in order to succeed as soon as it is possible to do so.
Multiple studies are showing that training on data contaminated with LLM output makes LLMs worse, but there’s no inherent reason why LLMs must be trained on this data. As you say, people are aware of it and they’re going to be avoiding it. At the very least, they will compare the newly trained LLM to their best existing one and if the new one is worse, they won’t switch over. The era of being able to download the entire internet (so to speak) is over but this means that AI will be getting better more slowly, not that it will be getting worse.
I don’t disagree, but before the recent breakthroughs I would have said that AI is like fusion power in the sense that it has been 50 years away for 50 years. If the current approach doesn’t get us there, who knows how long it will take to discover one that does?
It would be odd if AI somehow got worse. I mean, wouldn’t they just revert to a backup?
Anyway, I think (1) is extremely unlikely but I would add (3) the existing algorithms are fundamentally insufficient for AGI no matter how much they’re scaled up. A breakthrough is necessary which may not happen for a long time.
I think (3) is true but I also thought that the existing algorithms were fundamentally insufficient for getting to where we are now, and I was wrong. It turns out that they did just need to be scaled up…
It’s funny - I live in a big city because I have to, and I constantly complain about it to my friend, who wanted to move to this city so much that one day she just drove here with almost no money and no place to stay. I don’t think she’s very sympathetic.
I’m not sure what other outcome this woman could reasonably have expected.
Vance is Catholic.
The important thing here isn’t that the AI is worse than humans. It’s that the AI is worth comparing to humans. Humans stay the same while software can quickly improve by orders of magnitude.
This is what international law has to say about incendiary weapons:
- It is prohibited in all circumstances to make the civilian population as such, individual civilians or civilian objects the object of attack by incendiary weapons.
- It is prohibited in all circumstances to make any military objective located within a concentration of civilians the object of attack by air-delivered incendiary weapons.
- It is further prohibited to make any military objective located within a concentration of civilians the object of attack by means of incendiary weapons other than air-delivered incendiary weapons, except when such military objective is clearly separated from the concentration of civilians and all feasible precautions are taken with a view to limiting the incendiary effects to the military objective and to avoiding, and in any event to minimizing, incidental loss of civilian life, injury to civilians and damage to civilian objects.
- It is prohibited to make forests or other kinds of plant cover the object of attack by incendiary weapons except when such natural elements are used to cover, conceal or camouflage combatants or other military objectives, or are themselves military objectives.
This treeline is clearly not located within a concentration of civilians, and it is concealing (or is plausibly believed to be concealing) enemy combatants; therefore the use of incendiary weapons is unambiguously legal.
That’s actually a very low price for an anti-air missile. For comparison, the Stinger shoulder-fired missile costs more than twice as much. A Patriot missile costs four million dollars (but is much more capable). Presumably minimizing cost was a high priority when this missile was designed. Nonetheless, the cost asymmetry is one reason why degrading the ability of Hamas and Hezbollah to fire missiles at Israel is important.
I don’t think helping people is a bad thing. I’m generally in favor of a relatively high level of help (I vote for centrist Democrats, not Republicans) but I think that sometimes it is justified and appropriate to help less rather than more.
I’m not in favor of that either (with a few exceptions related to national security).
Nothing can fix things because teenagers will not cooperate. If Instagram could identify all its teenage users, those users would move to a platform that couldn’t. The only thing the restrictions achieve is a reduction in the market share of the platform with the restrictions.