Linux didn’t exist when I was 12. 😑
Good grief! The word is excluded. Holy shit.
I think that being forced to learn about WINE at a young age may have been beneficial actually (if extremely unpleasant)
So unpleasant.

For some reason, Eternity shows this image until I clicked on the post lol
So I started with a DOS machine that my dad had at work. Then my school got a few Apple Macs in the library, so I played Oregon Trail on the green screen. Then the first computer we had at home that I was able to spend hours on ran Windows 3.1.
That of course raises a very interesting scientific research question, which I took the liberty of looking into in the scientific literature:
“Does early exposure to different operating systems (macOS vs. Windows) correlate with differences in technological literacy and general problem-solving abilities among children and adolescents?”
The available research does not provide conclusive evidence that early exposure to different operating systems directly correlates with differences in technological literacy or problem-solving abilities among children and adolescents.
While studies reveal some interesting distinctions, the evidence is limited. Ronaldo Muyu et al., 2022 found Windows is more popular among university students (84.61% vs. 11.38% for macOS), suggesting potential usage differences. Shahid I. Ali et al., 2019 found no significant competency differences between Mac and Windows users in Excel skills. Cem Topcuoglu et al., 2024 noted that users’ perceptions of operating systems are often based on reputation rather than technical understanding.
Interestingly, Bijou Yang et al., 2003 found Mac users had significantly greater computer anxiety, which might indirectly impact technological literacy.
More targeted research is needed to definitively answer this question, particularly studies focusing on children and adolescents.
I think early exposure to several different OS’s means you’re at least not too poor, and lack of money does correlate a lot with illiteracy of all sorts.
I think you misunderstand: the question is not about exposure to different OSes, but about the correlation/causation of a given OS to later cognitive (and other) abilities. Please do apply adequate scientific rigor here!
What about people who started on DOS?
Or AmigaOS?
Or Basic 2.0?
They are either database administrators or completely oblivious to modern technology
First computer I used was DOS.
Also DOS. The single button on the Mac mouse was a whole new way of using a computer.
Mine had 3.1 on it, but most of the games had to be run through the DOS prompt.
What operating system was that Atari with a keyboard you could plug 2800 carts into?
Atari OS, which could only be used to access the floppy drive. Atari DOS could be booted from a floppy disk. I never used one of these machines; I just skimmed the Wikipedia article on Atari 8-bit computers.
That was also AutistOS
*Reads comments in thread*
I started with a pair of matchsticks and a trenchcoat that I got at Gallipoli in WW1, using the Phosphorus I found in the Bosphorus to craft a makeshift TI calculator based on specs I got via Fax from a Samurai. I ran Slackware on my slacks until we defeated the Ottomans, but they unleashed their Puppy Linuxes on us, and we stood no chance.
Ummm how do kids turn out if you install Linux Mint on a cheap laptop and give it to them to screw around with? Asking for a friend.
It leads the kid to Arch. I hope you’re prepared to always hear “I use Arch, btw.”
I’ll let you know in 10 years.
Nice… I meant I gave my 7yo (at the time) a computer we put Mint on. He is 9 now, so by 19 I think we’ll see how it has changed his skill level vs. the gen pop.
BAAAABE, I WANT A KID.
My cousin became an IT tech. I set her up with Ubuntu on a cheap desktop when she was about 12.
My 8 and 9 year old kids use Xubuntu on a 2013 MacBook Air. They use it for writing stories, making a lot of pixel art with Piko Pixel, and some code-block-style programming with Lego Spike. They are learning about multi-user systems, file management, etc. I’m keeping an eye out for a cheap PC that can run Minecraft (lots of those around right now since people are just trashing old Win 10 machines) because the older kid wants to learn how to make Minecraft mods.
“discluded”
🤣
De-un-cluded even
I thought so too, but turns out it is a word, even if it might be misused here: https://english.stackexchange.com/questions/129015/is-disclude-a-word-and-what-authority-says-a-word-is-a-word-or-isnt
Notice that ain’t a dictionary.
We do not wish to exclude the population because it would preclude comparative analysis, but we wish to disclude them from this study in order to conclude the initial hypothesis.
In disclusion I should probably learn my engrish.
Now include perclude and reclude! (Ok, I’m afraid English forgot to loot the last two from Latin’s pockets, after she robbed her in a dark alleyway)
Not to intentionally interclude, but perclude and reclude seem to have been secluded from English.
I started on a Mac from Apple’s bad days. The school computers were Windows and it felt like all the other kids had Windows computers at home. I think feeling like I was at a disadvantage probably had an effect on me that led me to Linux. Also, the second family computer ran Windows ME, so…
I started out with old Macs running System 7, and it was great. I had several good games installed from floppy disks and found some great shareware games online when we got our first modem and internet connection.
I think that when you started matters a lot.
I’ve seen so many people on the “Only Millennials know how to use computers” train, just kinda forgetting how many of this cohort didn’t get their hands on a computer until that first generation of Apples and Dells ended up in resale shops or on eBay at deep discounts.
So many folks who see kids on touch screens and throw fits, because that’s not how a “real computer” works, were throwing fits at their parents ten years ago for not understanding how intuitive a touch screen is.
Feels like it’s all an excuse for people to get mad at one another, while occluding the simple fact that using a thing for a long time gives you more experience with the thing.
Yes, people keep finding ways to put others down in order to feel superior. It’s called being a bully. When everything was “blame and shame millennials for this”, there was a section of us millennials who swore we’d break the cycle of generational blaming. Now it’s all about blaming and shaming Gen-Z, because that shit gets clicks. Apparently being a bully never really goes out of style.
It’s not really so much the form factor of the hardware. I think it’s more to do with the increasing complexity of the apps and how they’re designed to hide a lot of what goes on behind the scenes. Think about how the earliest versions of Android didn’t even come with a basic file browser, for example.
It’s the overall push to turn computers into single-use appliances, rather than general purpose devices.
Think about how the earliest versions of Android didn’t even come with a basic file browser, for example.
They didn’t offer an official app, but the Google Store was flooded with 3rd party alternatives practically the day the OS was released.
Even then, knowing what an “App Store” is and how/why you’d use it is a skill more common among younger users. My mother, who happily goes on her laptop and installs all sorts of garbage, had no idea how to add an app to her phone. My younger coworkers are much more comfortable working through Citrix and other Cloud Services, because they don’t expect a file system to be living under every app they use.
It’s the overall push to turn computers into single-use appliances, rather than general purpose devices.
I more felt that the phone was becoming a kind of mono-device or universal remote, with a lot of the IoT tech trying to turn it into an interface for every other kind of physical appliance. If anything, I feel like the phone does too much. As a result, its interface has to become more digital and generic and uniform in a way that makes using distinct applications and operations confusingly vague.
But growing up in an analog world has definitely tilted my expectations. Younger people seem perfectly fine with syncing their phones to anything with a receiver or RFID tag. And the particularly savvy ones seem interested in exploiting the fact that everything has a frequency. I’ve met more than a few kids who have fucked around with the Flipper and other wireless receiver gadgets.
Absolutely. But I don’t think it’s crucial. If you test a bunch of 30-year-olds on tech literacy and one started using a computer at 29, he will perform badly. But if you test a child of 12 who has had a PC for 2 years and a 30-year-old who has had a PC for 2 years, it becomes so irrelevant that sheer interest in the topic will determine the outcome. Though children do of course find everything interesting.
I think the reason we have the perception that children learn fast comes down to focus. They have unique abilities with their new little unstuffed heads, while a grown-up will worry about not understanding, think about something else entirely, not have time, etc…
I mean younger brains do have more neuroplasticity and other factors, hence it’s easier for children to learn more languages than adults. I assume this applies to more than just language.
it’s easier for children to learn more languages than adults
Kids are also assumed to operate at a child’s language level. So an 8-year-old speaking both English and Spanish at the 1st grade level is impressive. But a 20-year-old speaking at a 1st grade level is considered remedial.
Even then, there’s a lot to be said for experience. Computers and languages alike benefit from years of exposure. A large English vocabulary will help you pick up Spanish faster. And many years of experience on an Apple will clue you into tricks a naive Windows/Linux user would never consider.
I remember my dad trying to limit my screen time by putting a password lock on the screen saver. He was shocked to discover that an eight year old figured out how to evade it by… restarting the computer. But then he enabled password on restart and got cagey when typing it in, and that slowed my Hackerman attempts down significantly.
Kids tend to learn basic things faster. But they lack the breadth of experience to recall and apply strategies and patterns they’ve accrued over a lifetime. So much of what we consider “smart” versus “dumb” in problem solving is just “how many times have you already seen the answer to this question applied successfully?” Figuring something out for the first time is always harder than applying the heuristic you’ve been using half your life.
You raise really good points, but I’d want to add that abstract thought and the general ability to “think around corners” are at least as important (if not more so the higher you go) as experience insofar as problem solving and “smarts” go.
They do have more neuroplasticity. But we have to define what that means and where this phenomenon comes from. Most people just assume that younger equals better. This is not the case. You can even keep the neuroplasticity you had as a child. One of the defining characteristics of neuroplasticity is the ability to adapt to new views. Since children have no views yet, they have no conflicting views either, which makes acceptance easy. This is not the case with adults. But you can instrumentalize that knowledge to essentially undo your conflicting nature and increase neuroplasticity immensely. There is a cutoff and you will have a drop in potential for neuroplasticity as you age, but it isn’t in your childhood years. If you’re interested, I remember reading a study on the effect of life-long meditation on Alzheimer’s. Maybe you can find it again. That’s just one technique to increase neuroplasticity. I didn’t want to mention this part, as it goes against common knowledge. But common knowledge is rarely correct.
Source: I have had to deal with significant loss of neural function through mental illness and have read up a lot about this topic to better myself.
I meant more that it makes a difference if your first computer was a Macintosh SE vs a MacBook Air.
Yeah, my thought exactly. I was raised on Macintosh and it’s completely different from the current Apple product experience.
Ah, that’s true and I misunderstood you then. Though it’s hard for me to understand how that implies tech literacy in a total sense, since my grandpa has had a PC since the 20th century and, while tech-literate for his generation, he is really not comparable to Gen Y through Alpha. I would also wonder how this compares to devs, since most of them grow up relatively the same way. PC in front of them, seeing code, monkey see monkey do, wam bam bap, software developer. I am in my 20s so I do not have first-hand knowledge of the first personal computers.
Agreed, I think it’s the main thing. My parents at the very least were firm believers in using computers from an early age, so as far back as I could remember I had my own PowerMac G3. With the rad blue monitor and round mouse.