I've used all three of these. I don't even want to look at the MS Visual C/C++ ecosystem.

  • The non-standard byte lengths Microsoft still maintains are all perfectly valid within the C/C++ spec. They sure chose interesting defaults, but the issue is that the language spec basically defines a byte and says "as long as short is no longer than int and long isn't shorter than int you're probably good, good luck lol".
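    A minimal sketch of what that looseness looks like in practice (the exact numbers printed depend entirely on your compiler and target): the standard only pins down minimum ranges and a relative ordering, never exact widths.

    ```c
    #include <stdio.h>

    int main(void) {
        /* The C standard guarantees only minimum ranges and an ordering:
           sizeof(short) <= sizeof(int) <= sizeof(long).
           The concrete values vary: long is 4 bytes on 64-bit MSVC
           but 8 bytes on 64-bit Linux/gcc, and both are conforming. */
        printf("short: %zu bytes\n", sizeof(short));
        printf("int:   %zu bytes\n", sizeof(int));
        printf("long:  %zu bytes\n", sizeof(long));
        return 0;
    }
    ```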

    Microsoft did royally fuck up with VC++ 6, adding tons of non-standard extensions in ways that weren't immediately recognisable (though they were improvements to the language), but they've mostly corrected themselves since.

    If data structures weren't working with MSVC, you're probably working with non-portable code in the first place. Don't assume an int is 32 bits long! I've tried compiling a library or two for microcontrollers and people sure like to pretend that the standard defines them to be 32-bit at minimum! Why can't people use int32_t if they need to hold numbers of a certain size, for fuck's sake.
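    A minimal sketch of the fix: when the width actually matters, spell it out with the fixed-width types from <stdint.h> instead of guessing what int happens to be on your platform.

    ```c
    #include <stdint.h>
    #include <inttypes.h>
    #include <stdio.h>

    int main(void) {
        /* int32_t is exactly 32 bits on every platform that provides it;
           plain int is only guaranteed to be at least 16 bits. */
        int32_t counter = 70000;   /* would overflow a 16-bit int */
        uint8_t flags   = 0x2A;
        printf("counter = %" PRId32 ", flags = 0x%" PRIX8 "\n", counter, flags);
        return 0;
    }
    ```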

    It’s pretty funny how MSVC went from “invented their own dialect” to “one of the most complete compilers when it comes to modern standards”. Their error messages have improved massively as well. Their tools just don’t do Linux or obscure architectures, making them a rather regrettable choice for many open source projects out there.

    • JGrffn@lemmy.ml
      1 year ago

      If data structures weren’t working with MSVC, you’re probably working with non-portable code in the first place. Don’t assume an int is 32 bits long!

      Oh absolutely! I was starting out during this time, and started using memcpy for a uni project, hardcoding byte sizes to what I assumed long’s size was, instead of checking or using standardized data types (because I didn’t even know they existed). The result was such a mess, exacerbated by the good ol’ “let’s write it all in one go and run it when we’re done”. Boy, did I suffer in that class.
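      For anyone reading along, a minimal sketch of the lesson here: let sizeof and the fixed-width types do the counting instead of hardcoding what you assume long's size is.

      ```c
      #include <stdint.h>
      #include <string.h>
      #include <stdio.h>

      int main(void) {
          /* Fixed-width type instead of assuming what `long` happens to be */
          int64_t src = 42;
          int64_t dst = 0;

          /* sizeof src, never a hardcoded 4 or 8 */
          memcpy(&dst, &src, sizeof src);

          printf("dst = %lld\n", (long long)dst);
          return 0;
      }
      ```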