• 2 Posts
  • 136 Comments
Joined 2 years ago
Cake day: August 8th, 2023




  • If it works, I don’t update unless I’m bored or something. I also spread things out across multiple machines, so there’s less chance of something breaking the way you describe with the charts feature going away. My NAS is pretty much just a NAS now.

    You can probably back up your configs/data, upgrade, then deploy Jellyfin again, restore, and reconfigure. You should also back up the data on your ZFS pool first (rough sketch of the snapshot step below). For what it’s worth, I recently updated to the latest TrueNAS Scale from a ~5-year-old FreeBSD version of TrueNAS and the pools still worked fine (none of the “apps” or jails worked, obviously). The upgrade process even ported my service configurations over. I didn’t care about much of the data in the pools, so I only backed up the most important stuff.
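    A minimal sketch of the “snapshot and replicate before upgrading” step, assuming a hypothetical pool layout (tank/media, tank/appdata) and a second pool called backup; this is just the plain zfs CLI driven from Python, not TrueNAS’s own upgrade or replication tooling:

    ```python
    import subprocess
    from datetime import date

    # Hypothetical dataset names -- adjust to your own pool layout.
    DATASETS = ["tank/media", "tank/appdata"]
    BACKUP_POOL = "backup"  # e.g. a second/external pool

    snap = f"pre-upgrade-{date.today().isoformat()}"

    for ds in DATASETS:
        # Recursive snapshot so child datasets are included too.
        subprocess.run(["zfs", "snapshot", "-r", f"{ds}@{snap}"], check=True)

        # Replicate the snapshot to the backup pool before touching the OS.
        send = subprocess.Popen(["zfs", "send", "-R", f"{ds}@{snap}"],
                                stdout=subprocess.PIPE)
        subprocess.run(["zfs", "recv", "-u", f"{BACKUP_POOL}/{ds.split('/')[-1]}"],
                       stdin=send.stdout, check=True)
        send.stdout.close()
        send.wait()
    ```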


  • I personally use a dual-core Pentium with 16GB of RAM. When I first installed TrueNAS (FreeNAS back then), I only had 8GB of RAM, but that proved not to be enough to run all the services I wanted, so I would suggest 12-16GB. Depending on the services you want to run, any multi-core x86 CPU that supports 16GB of RAM should be adequate. I believe TrueNAS recommends ECC RAM, but I don’t think using consumer-grade RAM and hardware has caused me any problems. I’m also using an old SSD for the system drive, which is what’s recommended now (I used to use two mirrored USB thumb drives, but that’s not recommended anymore). Very importantly, make sure the HDD(s) you get are not shingled (SMR) drives; I made that mistake initially, and performance was ridiculously bad.


  • Yeah. If you’re a minor, you have to take Driver’s Ed, which requires a couple hours of driving with an instructor. If you’re an adult, you can just take the written and driving tests. I think I just drove around the block and did a reverse parking test for my driving test. Depending on where you live, roundabouts may not be common; they aren’t here. I don’t think I saw one IRL until my late 20s, when I moved to a different state.



  • Marginal cost doesn’t always decrease. More people buying gold or whatever won’t decrease the price of gold. The cheapest way to feed cattle is to just let them graze, but there isn’t enough land on Earth for everyone to eat as much beef as Americans do, even using intensive agriculture to grow feed (which degrades the soil over time and results in large amounts of greenhouse gas emissions). I don’t think there’s enough land on Earth to maintain the current human population for very long. I.e., I think we are in the overshoot phase of a boom-and-bust population dynamic. Saw this graphic a while back, and it’s wild how much of the biomass we’ve taken over:



  • Some people’s aversion to algorithms on the fediverse kind of reminds me of people’s aversion to GMO food. Genetically modifying rice to contain more vitamin D is probably good; genetically modifying vegetables to contain more cyanide would probably be bad. Algorithms don’t have to be built to maximize “engagement;” they can be designed to maximize other metrics, balance multiple metrics, or be user-customizable (rough sketch of that at the end of this comment).

    IMO, Mastodon is much worse off for its refusal to implement any kind of algorithm outside the “explore” feed. When I tried using Mastodon, search results were unhelpfully in chronological order, and my home feed just got overtaken by the people who post the most. In contrast, Lemmy’s handling of algorithms is pretty good, imo.

    As bad as search engines are now, they’d be even worse if they just gave you results in chronological order.
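    As a minimal sketch of what a non-engagement-maximizing, user-tunable feed ranking could look like (the weights and post fields here are made up for illustration, not anything Mastodon or Lemmy actually implements):

    ```python
    import math
    import time

    # Hypothetical, user-adjustable weights -- the point is that the metric
    # being optimized is a knob the user controls, not "engagement" by fiat.
    WEIGHTS = {"recency": 1.0, "score": 0.5, "diversity": 0.8}

    def rank(post, seen_authors, now=None):
        """Score one post; higher scores are shown earlier. Fields are illustrative."""
        now = now or time.time()
        age_hours = (now - post["created_at"]) / 3600
        recency = 1 / (age_hours + 2) ** 1.5                        # time decay
        score = math.log1p(max(post["upvotes"] - post["downvotes"], 0))
        diversity = 0.0 if post["author"] in seen_authors else 1.0  # damp prolific posters
        return (WEIGHTS["recency"] * recency
                + WEIGHTS["score"] * score
                + WEIGHTS["diversity"] * diversity)
    ```

    A purely chronological feed is just the special case where only the recency weight is nonzero, and an “engagement” feed is roughly the case where only the score term counts.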






  • The PC I’m using as a little NAS usually draws around 75 watts. My Jellyfin and general home server draws about 50 watts while idle but can jump up to 150 watts. Most of the components are very old. I know I could get the power usage down significantly with newer components, but I’m not sure the electricity savings outweigh the cost of sending the old parts to the landfill and creating demand for more new components to be manufactured (rough yearly numbers below).
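    Back-of-the-envelope numbers for that trade-off, assuming a hypothetical electricity price of $0.15/kWh (not from the comment above; substitute your local rate):

    ```python
    # Yearly electricity cost for an always-on machine at a constant draw.
    HOURS_PER_YEAR = 24 * 365      # 8760 h
    PRICE_PER_KWH = 0.15           # assumed rate, USD -- adjust to your own

    def yearly_cost(watts):
        """Yearly cost in USD for a box drawing `watts` continuously."""
        return watts * HOURS_PER_YEAR / 1000 * PRICE_PER_KWH

    for label, watts in [("NAS", 75), ("server idle", 50), ("server peak", 150)]:
        print(f"{label}: {watts} W ~= ${yearly_cost(watts):.0f}/year")
    # NAS: 75 W ~= $99/year
    # server idle: 50 W ~= $66/year
    # server peak: 150 W ~= $197/year
    ```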




  • Last time I looked it up and calculated it, these large models are trained on something like only 7x as many tokens as they have parameters. If you think of it like compression, compressing text 7:1 losslessly is perfectly possible (rough numbers sketched at the end of this comment).

    I think the models can still output a lot of stuff verbatim if you try to get them to; you just hit the guardrails they put in place. It seems to work fine for public-domain stuff, e.g. “Give me the first 50 lines from Romeo and Juliet” (albeit with a TOS warning, lol). “Give me the first few paragraphs of Dune” seems to hit a guardrail, or maybe the refusal was just trained in through reinforcement learning.

    A preprint paper was released recently that detailed how to get around RL by controlling the first few tokens of a model’s output, showing the “unsafe” data is still in there.
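    Rough numbers behind the tokens-per-parameter framing in the first paragraph, using a hypothetical 70B-parameter model (the specific sizes are illustrative, not any particular model’s):

    ```python
    # Compare the raw size of the training text to the size of the weights.
    params = 70e9                   # hypothetical parameter count
    tokens = 7 * params             # ~7 training tokens per parameter

    corpus_bytes = tokens * 4       # rough average of ~4 bytes of text per token
    weight_bytes = params * 2       # 2 bytes per parameter at fp16

    print(f"training text ~ {corpus_bytes / 1e12:.1f} TB")
    print(f"model weights ~ {weight_bytes / 1e9:.0f} GB")
    print(f"ratio ~ {corpus_bytes / weight_bytes:.0f}:1")
    # training text ~ 2.0 TB, model weights ~ 140 GB, ratio ~ 14:1
    ```

    So even in raw bytes the training data is only an order of magnitude or so larger than the weights, which is why near-verbatim recall of chunks of it isn’t implausible.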



  • Yeah, the company I was working at got bought out, and then they laid off the entire tech team and pretty much everyone else. I co-founded a business with coworkers, but it’s not bringing in any revenue and I’m not sure it ever will bring in very much, so I’ve been applying to jobs. I’ve only gotten a few interviews, then been ghosted afterwards. I’m guessing part of it is that I have a criminal charge pending, and the first thing you see on Google when you search my name and town is one of those mugshot websites. Maybe I should go into construction, lol.