systemd
Interesting. So they did it two years ago for the lols… I mean for the u8"😀"s*… which wasn’t even really needed, as one of the PR comments pointed out.
* Yes, the mission creep is so bad that the shit show that is systemd has emoticons.
Multi-threading support
Who stopped using pthreads/Windows threads for this?
Unicode support
Those who care use icu anyway.
memccpy()
First of all, 😄.
Secondly, it’s a library feature, not a language one.
Thirdly, it has existed in POSIX forever (quick sketch below).
And lastly, good bait 😄.
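For anyone who hasn’t met it, here’s a minimal sketch of what memccpy does (the buffer size and the ',' delimiter below are made up for illustration):

```c
/* memccpy is a POSIX <string.h> function (C23 merely adopts it);
 * ask for the POSIX declarations explicitly. */
#define _XOPEN_SOURCE 700

#include <stdio.h>
#include <string.h>

int main(void)
{
    char dst[32];
    const char *src = "hello, world";

    /* Copy at most sizeof dst bytes from src to dst, stopping right after
     * the first ',' byte. Returns a pointer just past that ',' in dst,
     * or NULL if no ',' was found within the given length. */
    char *end = memccpy(dst, src, ',', sizeof dst);
    if (end != NULL) {
        *end = '\0';          /* terminate what was copied */
        printf("%s\n", dst);  /* prints: hello, */
    }
    return 0;
}
```

That pointer-past-the-stop-byte return value is what makes it handy for chained copies without repeated strlen calls.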
What’s so bad about “Various syntax changes improve compatibility with C++”?
It’s bad because compiler implementations keep adding warnings, and enabling them by default, about completely valid usage that got “deprecated” or “removed” in “future versions of C” that I will never use or give a fuck about. So my CI runs, which all minimally have -Wall -Werror, can fail after a compiler upgrade over stuff that is absolutely irrelevant to me. If it wasn’t for that, I wouldn’t even know these changes existed, because I have zero interest in them.
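To illustrate the kind of thing that trips this (a made-up example, not code from any of my projects): declarations and definitions without prototypes were valid C for decades, but newer compilers now warn about them because C23 removes them (e.g. clang’s -Wdeprecated-non-prototype), and -Werror turns that warning into a failed build.

```c
/* Valid C89/C99/C17; C23 drops non-prototype declarations and K&R-style
 * definitions, so newer compilers warn about both. Under -Wall -Werror,
 * a compiler upgrade alone is enough to fail the build. */
int parse();          /* declaration without a prototype */

int parse(s, flags)   /* K&R-style definition */
    const char *s;
    int flags;
{
    return s != 0 && flags >= 0;
}

int main(void)
{
    return parse("x", 1) ? 0 : 1;
}
```

Nothing here means anything different than it did in 1999; only the compiler’s opinion of it changed.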
Those who like C++ should use C++ anyway. They can use the C+classes style if they like (spoiler alert: they already do).
I can understand. But why would you not use newer C versions, if no compatibility with older versions is “required”?
Because C doesn’t exist in a vacuum, and Rust exists. Other choices exist too for those who don’t like Rust.
My C projects are mature and have been in production for a long time. They are mostly maintenance only, with new minor features added not so often, and only after careful consideration.
Still interested in knowing what relevant projects will be using C23.
That’s good and all, but we all explicitly pass -std=gnu99 (or -std=c99 if you don’t care about MSYS2 compat) in our build scripts, buddy 😉
Okay, maybe not all of us. But you get the idea.
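Purely as a sketch of the same idea expressed in-source (the version check and error message are mine, not from any particular project): a C99-pinned project can refuse to build under a newer -std=, so a changed toolchain default fails loudly instead of silently switching dialects.

```c
/* Assumes the build scripts pass -std=gnu99 or -std=c99, both of which
 * define __STDC_VERSION__ as 199901L. Anything newer aborts the build. */
#if defined(__STDC_VERSION__) && __STDC_VERSION__ > 199901L
#  error "this project is built as C99; pass -std=gnu99 (or -std=c99)"
#endif
```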
Are there any relevant projects that use the increasingly C++-infested newer versions of the language?
With hyper-threading and preemption in mind, maybe it’s concurrency all the way down 😎. But we should definitely keep this on the down low. Don’t want the pesky masses getting a whiff of this.
🤣
I don’t know, and I don’t want to get personal. But that’s usually a sign of someone who doesn’t even code (at non-trivial levels at least)*, and who thinks programming languages are like sports clubs, developers are like players contracted to play for one and only one club, and every person in the internet gallery needs, for some reason, to pick one club (and one club only) to be a fanboy of. Some people even center their whole personality around such fanboyism, and maybe even venture into fanaticism.
So each X vs Y language discussion in the mind of such a person is a pre-game or a pre-season discussion, where the game or season is an imaginary competition such people fully concoct in their minds, a competition that X or Y will eventually and decidedly “win”.
* Maybe that was an exaggeration on my part. Some junior developers probably fall into these traps too, although one might expect, or maybe hope, that their view wouldn’t be that detached from reality.
I’m hoping to finally finish and send out a delayed new release for one of my older and mature CLI tools this weekend. It’s written in C btw 😄
I hope that someone in the 40 comments I don’t have time to read right now has pointed out that OP’s premise is flawed for the simple reason that Rust hit v1.0 in 2015, while Zig is still nowhere near that milestone in 2024.
So we are not even talking about the same “future” period from the start.
So, no need to get to the second false premise in OP, which is limiting a “future” period to one successful dominating language only. Nor is there a need to go beyond these premises and discuss actual language details/features.
Bringing vimperator/pentadactyl back! That would be the dream.
Anyway, last time I tested it (~3 weeks ago), servo was still not very usable with the few websites I tried. Hopefully it gets at least part of the way there in a few months.
If you’re not into tiling, install openbox and a panel of your choosing. You will quickly find that you don’t need a DE at all.
The maintenance is too high.
acquired knowledge spotted
I will let you in on a little secret.
The best “support” you can get is support from upstreams directly (I’m involved in both sides of that equation). But upstreams will often only “support” you when 1. you run the latest stable version, and 2. the upstream source code wasn’t patched willy-nilly by the packager (your distro).
So the best desktop Linux experience comes from using a rolling distro that gives you such packages, with Arch being the most prominent example.
The “acquired knowledge” that argues for stability and tells you otherwise is a meme.
Because non-open ones are not available, even for a price. Unless, of course, you buy something bigger than the “standard” itself, like the company responsible for it or one with access to it.
There is also the process of standardization itself, with committees, working groups, public proposals, etc. involved.
Anyway, we can’t backtrack on calling ISO standards and their like “open” at the global level, hence my suggestion to use more precise language (“publicly available and sharable”) when talking about truly open standards.
The term “open standard” does not cut it. People should start using “publicly available and sharable” instead (maybe there is a better name for it).
ISO standards, for example, are technically “open”. But how relevant is that to a curious individual developer, when anything you need to implement would require access to multiple “open” standards, each coming with a (monetary) price, and with some extra shenanigans [archived] on top?
IETF standards, however, are actually truly open, as in publicly available and sharable.
It implies that the value of their policy work is significantly below…
Actually, it’s always safe to assume that value is negative unless proven otherwise.
Monthly Reminder: High or low, all Linux usage stats are fake.
BTW, the snippet I pointed to, and the whole match block, are not incoherent. They’re useless.
Alright. Explain this snippet and what you think it achieves:
tokio::task::spawn_blocking(move || -> Result { Ok(walkdir) })
Post the original code to !rust@programming.dev and point to where you got stuck, because that AI output is nonsensical to the point where I’m not sure what excited you about it. A self-contained example would be ideal; otherwise, include the crates you’re using (or the use statements).
Federation is irrelevant. Matrix is federated, yet most communities and users would lose communication if matrix.org went offline.
With transport-only distributability, which I think is what Radicle offers, availability would depend on the peers. That probably means less availability than a big hosted service.
Distributed transport and storage would fix this, à la something like Tahoe-LAFS or (old) Freenet/Hyphanet. And no, IPFS is not an option, because it’s generally a meme, is pull-based, and has availability/longevity problems with metadata alone. iroh claims to be less of a meme, but I don’t know if they fixed any of the big design (or rather lack-of-design) problems.
At the end of the day, people can live with GitHub/GitLab/… going down for a few minutes every other week, or 1-2 hours every other month, as the benefits outweigh the occasional inconvenience by a big margin.
And git itself is distributed anyway. So it’s not like anyone was cut from committing work locally or pushing commits to a mirror.
I guess waiting on CI runs would be the most relevant inconvenience. But that’s not a distributable part of any service/implementation that exists, or that could exist without quickly being gravely abused.
Definitely don’t use axum, which provides a simple interface for routes by using derived traits. Their release cycle is way shorter, which makes them more dangerous, and they’re part of the same GitHub user as tokio, which means they’re shilling their own product.
This, but (semi-)unironically.
Is this going to be re-posted every month?
Anyway, I’ve since come to know that the proposal was not part of a damage-control campaign, but rather a single person’s attempt at proposing a theoretical real solution. He misguidedly thought that there was actually an interest in real solutions. There wasn’t, and there isn’t.
The empire is continuing with the strategy of scamming people into believing that they will produce, at some unspecified point, complete magical ~~mushrooms~~ guidelines and real, specified and implemented profiles. The proposal is destined to become perma-vaporware. The dreamy guidelines are going to be perma-WIP, the magical profiles are going to be perma-vapordocs (as in they will never actually exist, not even in theoretical form), and the bureaucracy’s checks will continue to be cashed.
So not only was there no concrete strike back, it wasn’t even the empire that did it.