The core principle of GNU, from which every other principle is derived, is "I shouldn't need an ancient unmaintained printer driver that only works on Windows 95 to use my god damned printer. I should have the source code so I can adapt it to work with my smart toaster."
If an app is open source, I've almost never encountered a situation where I can't build a working version. It's happened to me once that I remember: a Synthesia clone called Linthesia. It would not compile for love nor money, and the provided binary was built for Ubuntu 12 or something.
Linux was probably ready for the 64-bit apocalypse even before Apple for this exact reason. Anything open source will just run, on anything, because some hobbyist has wanted to use it on their favourite platform at some point. And if not, you'd be surprised how feasible it is to check out the source code from GitHub and make your own port. Difficult, but far from impossible.
Steam games do not distribute source code, which means they break, and when they break the community can't fix them. They can't statically link glibc because that would put them in violation of the GPL (as far as I'm aware, anyway). They are fundamentally second-class citizens on Linux because they refuse to embrace its culture. FOSS apps basically never die while there's someone to maintain them.
It's like when American companies come to Europe, realise the workers have rights, and then get a reputation as scuzzballs for trying to rules-lawyer those rights.
glibc is released under the LGPL, not the GPL. It is completely fine to statically link it.
EDIT: But there are some extra things you have to do: https://www.gnu.org/licenses/gpl-faq.html#LGPLStaticVsDynamic
Okay, so bundle glibc. As far as I know the dynamic linker is set up to look in the working directory first.
Why would statically compiling it violate the GPL?
It wouldn’t; glibc is LGPL not GPL. The person you’re replying to was mistaken.
You know what, that explains how they can exist on Linux at all. Because from what I understand, if glibc were GPL and not LGPL, closed source software would basically be impossible to run on the platform. Which… maybe isn't the best outcome when you think about it. As much as I hate the Zoom VDI bridge, I don't want "using Windows" to be the alternative.
and yeah, from the source you provided, I can see why they don’t statically link. “If you dynamically link against an LGPLed library already present on the user’s computer, you need not convey the library’s source”. So basically if they bundle glibc then they need to provide the glibc source to users on request but if they just distribute a binary linked against the system one then that’s their obligations met.
Welcome to “complying with the LGPL for the terminally lazy”, I’ll be your host “Every early linux port of a steam game!”
My understanding of the linking rules for the GPL is that they’re pretty much always broken and I’m not even sure if they’re believed to be enforceable? I’m far out of my element there. I personally use MPLv2 when I want my project to be “use as you please and, if you change this code, please give your contributions back to the main project”
They missed the first character because they took the L
It should be noted that statically linking against an LGPL library does still come with some constraints. https://www.gnu.org/licenses/gpl-faq.html#LGPLStaticVsDynamic
You have to provide the source code for the version of the library you're linking somewhere. So basically, if you ship a statically linked glibc executable, you need to provide the source code for the glibc part you included. I think the ideal way to distribute it would be to not statically link it, and instead deliver a shared library bundled with your application.
EDIT: Statically linking libc is also a big pain in general; for example, you lose dlopen. It's best not to statically link it if possible. All other libraries, go for it.
Not so much, but that's easily fixed with an export LD_LIBRARY_PATH=.
Because you've created something that contains compiled GPL code that can't be untangled or swapped out. The licence for the GNU C Compiler is basically designed so you can't use it to build closed source software. It's a deal with a communist devil. If you want to build a binary that contains GPL code (which is what glibc is) then you have to make everything in that binary licensed under a GPL-compatible license. That's what the whole "Linux is a cancer that attaches itself in an intellectual property sense to everything it touches" quote from Steve Ballmer was in aid of. And he was correct, and this was literally the system operating as intended.
Dynamic linking is some looney tunes ass “see, technically not violating the GPL” shit that corporations use to get around this.
Oh, so bundling it and adding that env will work.
From a technical standpoint, yes. From a legal standpoint:
Welcome to "what did you think was going to happen if you told for-profit corporations that if they want to distribute a library in a bundle they also have to provide the source code, but if they just provide it linked against an ancient version that nobody will be using in 5 years, and don't even tell you which one, they're 100% in compliance"?
Could they? Yes. Will they? Probably not; that takes too much work.
This is why Steam's own Linux "soldier" runtime environment (which is available from the same dropdown as Proton) had to become a thing.
Also, your OS will tell you which library it can’t find.
Is there a site to download various .so files?
This shit is the exact reason Linux doesn't just have ridiculously bad backwards compatibility but has also alienated literally everyone who isn't a developer, and why the most stable ABI on Linux is god damn Win32 through Wine. Hell, for the same reason, fundamentally important things like accessibility tools keep breaking, something where the only correct answer is this blogpost. FOSS is awesome and all, but not if it demands that you become a developer and continuously invest hundreds of hours just so things won't break. We should be able to have both: free software AND good compatibility.
What you describe is in no way a strength; it's Linux's core problem. Something we have to overcome ASAP.
Linux ABI stability is tiered, with the syscall interface promising to never change, which should be enough for any application that depends on libc. Applications that depend on unstable ABIs are either poorly written (an ecosystem problem, not fixable by the kernel team; they're very explicit about what isn't stable) or are inherently unstable and assume some expertise from the user. I'd say the vast majority of programs just use the kernel through libc and thus should work almost indefinitely.
It isn't a core problem, it's a filter, and a damn good one. It keeps the bad behavior out of Linux. That's why people keep turning to it for lack of enshittification. Stable ABIs are what lead to corpo-capital interests infecting every single piece of technology and chaining us to their systems via vendor lock-in.
I wish the Windows users who are sick of Windows would stop moving to Linux and trying to change it into Windows. Yes, move to Linux if you want, but use Linux.
This might be the most awful Linuxbro take I’ve read this year, congratulations. Linux has to lack a stable ABI to keep the capitalists away and make apps constantly require maintenance to filter out bad behaviour? Just wow.
I really hope for way more people to come over so nonsense like this finally stops.
No. It's not about driving away the capitalists. It's about forcing them to bend to the community. It's not "Linux has to lack a stable ABI to keep the capitalists away", it's "Linux is not here to baby rich corporations and exempt them from rules that literally nobody, including little Timmy who's 14 and just submitted his first PHP patch, has a problem with". This is developers who are used to living in houses trying to set up shop in an apartment complex, then finding out different rules apply and being colossal babies about it.
The point of the GNU Project was to destroy the concept of closed source software. Which is a completely justified response to Xerox Incorporated telling you your printer is no longer supported and you just have to buy a new one. Capitalists are welcome. Anti-right-to-repair people can fuck right off, and if we had the right to repair their software we wouldn't have this problem in the first place, because someone else would have already fixed it.
And that fight against closed-source and anti-consumer shit is awesome, but it changes absolutely nothing about Linux being completely awful in terms of long-term support. Running old software is a whole project (for enthusiasts) in itself almost every single time; meanwhile, I can run almost any decade-old software on systems like Android or Windows simply by installing it, without having to be an IT professional.
Except that this causes usability issues for the 99.99% of users who aren't that little Timmy you just made up, and it causes accessibility tools, which are freaking essential for many people, to simply break. Old games becoming unplayable isn't an issue only because of their Windows versions and Wine, DXVK etc. We literally have to fall back to Windows software to keep software running because of how badly the Linux system architecture works for desktop usage. What a disgrace.
Literally has nothing to do with Linux's own problems.
Android is Linux! You're running your decade-old software on Linux. What was the last completely unmaintained binary that you pulled on Windows and ran (with no tweaking), and the last one that failed on Linux?
Why do you keep sharing that link instead of this one? https://fireborn.mataroa.blog/blog/i-want-to-love-linux-it-doesnt-love-me-back-post-4-wayland-is-growing-up-and-now-we-dont-have-a-choice/ The one where the same person you’ve been posting says clearly people are working on accessibility and things are improving?
Have you considered joining the community and working with it – like the author of the blog you keep sharing – instead of trying to insult everyone who works on it and calling it a disgrace?