The reality is that reliably backporting security fixes is expensive (partly because backports are hard in general). The older a distribution version is, the more work is generally required. To generalize somewhat, this work does not get done for free; someone has to pay for it.
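To make "backports are hard in general" concrete, here is a small self-contained sketch (the repo, file names, and commit contents are all invented): a stable branch freezes, mainline refactors, and a later security fix written against the refactored code no longer applies cleanly to stable.

```shell
# Toy repo (contents invented) showing why backports need manual work:
# "stable" froze before mainline refactored, so the later security fix
# no longer cherry-picks cleanly onto it.
tmp=$(mktemp -d) && cd "$tmp"
git init -q repo && cd repo
git config user.email lts@example.com
git config user.name "LTS Maintainer"
printf 'old_api()\n' > lib.c
git add lib.c && git commit -qm 'initial release'
git branch stable                       # the LTS branch freezes here
printf 'new_api()\n' > lib.c
git commit -qam 'refactor to new_api'   # mainline moves on...
printf 'new_api(sanitized)\n' > lib.c
git commit -qam 'security fix'          # ...then fixes a bug on top
fix=$(git rev-parse HEAD)
git checkout -q stable
# The fix was written against new_api(), which stable never had, so the
# three-way merge conflicts and a human must redo the patch by hand.
if git cherry-pick "$fix" 2>/dev/null; then result=clean; else result=conflict; fi
echo "backport result: $result"
```

Multiply that manual conflict resolution across thousands of packages and several years of divergence, and the cost the article describes follows directly.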

People using Linux distributions have for years been in the fortunate position that companies with money were willing to fund a lot of painstaking work and then make the result available for free. One of the artifacts of this was free distributions with long support periods. My view is that this supply of corporate money is in the process of drying up, and with it will go that free long term support. This won’t be a pleasant process.

  • nottelling@lemmy.world

    I could be wrong, but isn’t the entire Debian stable tree maintained for years via open source contributions? Sure, the Red Hat downstreams might be on their own, but there are plenty of non-commercial distros that keep up to date.

    • pnutzh4x0r (OP)

      According to Debian Releases:

      Debian announces its new stable release on a regular basis. Users can expect 3 years of full support for each release and 2 years of extra LTS support.

      So about 5 years, though it is not clear how well this works in practice (how much is actually updated and how well it is supported).

      From the Debian Wiki - LTS:

      Companies using Debian who benefit from this project are encouraged to either help directly or contribute financially. The number of properly supported packages depends directly on the level of support that the LTS team receives.

      I think this is sort of what the article is pointing towards… long-term support really depends on commercial support, since volunteers are more likely to work on the current or most recent thing than to go back and backport or update older things. If corporate funding dries up (which it appears to be doing), volunteers will still contribute some to long-term Linux distributions, but not at the level they currently do with commercial support.

    • Auli@lemmy.ca

      Yes, but the real issue is that the downstreams have had it pretty easy and now have to do more work.

  • lemmyng@beehaw.org

    The rationale for using LTS distros is being eroded by the widespread adoption of containers and approaches like flatpak and nix. Applications and services are becoming less dependent on any single distro and instead just require a skeleton core system that is easier to keep up to date. Coupled with the increasing cost of maintaining security backports, we are reaching a point where it’s less risky for companies to use bleeding edge than stable.

    • lemmyvore@feddit.nl

      widespread adoption of containers and approaches like flatpak and nix

      And it’s about flippin time. Despite predating app stores by decades, the Linux package systems have been surprisingly conservative in their approach.

      The outdated, hardcoded file hierarchy combined with rigid package management has ossified to a ridiculous degree.

      It’s actually telling that Linux packaging systems had to be circumvented with third-party approaches like Snap, Flatpak, AppImage etc. — because for the longest time they couldn’t handle things like having two versions of the same package installed at the same time, resolving old dependencies, system downgrades, or recovery.

      Linux has had advanced machinery like overlayfs for 20 years but never used any of it for packages. Yet we have 20 different solutions for init.
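The "two versions at once, easy downgrade" complaint is exactly what nix/flatpak-style versioned stores address, and the core idea fits in a few lines of plain shell (the store layout and the "foo" package are invented for illustration): every version gets its own prefix, and "current" is just a symlink, so rollback is repointing one link.

```shell
# Sketch of a versioned store (all paths invented): each version keeps
# its own prefix, so two versions coexist and a downgrade is atomic.
store=$(mktemp -d)
mkdir -p "$store/foo-1.0/bin" "$store/foo-2.0/bin"
printf '#!/bin/sh\necho foo 1.0\n' > "$store/foo-1.0/bin/foo"
printf '#!/bin/sh\necho foo 2.0\n' > "$store/foo-2.0/bin/foo"
chmod +x "$store"/foo-*/bin/foo
ln -s   "$store/foo-2.0" "$store/current"   # "install" 2.0
"$store/current/bin/foo"                    # prints: foo 2.0
ln -sfn "$store/foo-1.0" "$store/current"   # instant rollback
"$store/current/bin/foo"                    # prints: foo 1.0
```

Classic `/usr`-based packaging can't do this because both versions would claim the same file paths; a per-version prefix plus a symlink sidesteps the collision entirely.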

      • Manbart@beehaw.org

        Like everything, it’s a trade-off. Windows allows different versions of the same libraries, but at the cost of an ever-growing WinSxS folder and slow updates.

    • tetha@feddit.de

      And that skeleton of a system becomes easier to test.

      I don’t need to test ~80-100 different in-house applications on however many different versions of Java, Python, .NET and so on.

      Instead I end up with 12 different classes of systems. My integration tests on a build server can check these thoroughly every night against multiple versions of the OS. And if the integration tests are green, I can be 95-99% sure things will work right. The dev and testing environments will catch the rest if something wonky is going on with Docker and new kernels.
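The nightly matrix described above can be sketched as a plain shell loop (the image list and the `./integration-tests.sh` entrypoint are invented; the loop only assembles the Docker commands rather than executing them, so the sketch runs without a Docker daemon):

```shell
# Sketch of a nightly integration-test matrix (image names and the test
# entrypoint are invented). One test script, several base-OS images;
# here we only build up the command list instead of invoking Docker.
images="debian:11 debian:12 ubuntu:22.04 ubuntu:24.04"
plan=""
for img in $images; do
    plan="$plan docker run --rm -v \$PWD:/app -w /app $img ./integration-tests.sh
"
done
printf '%s' "$plan"
```

In a real setup each assembled command would be executed (and its exit status collected) by the build server, with a red night pointing at exactly which OS version broke.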

  • Zeth0s@lemmy.world

    I believe the main problem is how companies work. If I tell finance that I want to donate to an open source project half of what we are paying in licenses for the rubbish commercial alternative we use now, they will simply say no. No discussion at all. We always need commercial entities that work via licensing to support open source tools. This is also one reason for Red Hat’s success compared to Debian: companies don’t pay Debian. I couldn’t even if I wanted to, because Debian doesn’t offer a package of enterprise licenses, and that is the only option finance understands.

    It is crazy and a pity…

    • Baron Von J@lemmy.world

      Agreed, but there’s more to it than just “we need to pay for a support contract.” There’s also “we want a contract that indemnifies us against a FOSS reciprocal license claim against the product we sell.” That is something that really contributed to RHEL’s dominant position.

  • Drito@sh.itjust.works

    Linux can work without big, bloated components. Companies funded the development of components that can be replaced by simpler, easier-to-maintain ones.

  • nyan@lemmy.cafe

    “Somebody has to pay for it”? Nonsense. People working on community distros are not paid, and a lot of patches come out of those distros. If it’s important enough, those distros can and will do their own backports.

    Furthermore, the existence of distros running ancient kernels and software benefits industry more than it does individual users—“stability über alles” is something you mostly need on servers and controllers for expensive devices. They’ll keep funding the backports because they need them more than they need the initial security fixes targeting the latest versions.

  • Ryan@lemmy.world

    This is why I tend to use rolling releases; the only non-rolling distro I use is Fedora. To me, backporting security updates seems rife with issues because of limited familiarity with each codebase: the people working on the distro have to backport fixes to hundreds or even thousands of packages, and that reduced familiarity makes it more likely they’ll introduce additional bugs.