I think they should at least do an annual “Year of the Linux desktop!” episode!
I wouldn’t say it has anything to do with the financial affluence of the gamer, but I agree with you that the vast vast majority of gamers simply do not care. Like with a lot of things, that same majority would be better off if they did.
Honestly you’re spoiled for choice when it comes to CPUs; anything you’re looking at should for the most part “just work” as long as it’s within the last 3-8 generations (I’d recommend the last 2, since they significantly improved power efficiency and you’re going for a laptop). What you’ll mainly want to consider is Linux support for the system devices (Wi-Fi, etc.), which you can Google per model, and the robustness of the device itself (slightly subjective, but a 1.1 lb, 5 mm thick whatever is generally less robust than, say, a ThinkPad).
Also there’s a decent difference in “I want to spend the entire time disconnected” and “I want to enjoy nature for the day, crack a beer by the fire, and then game a little before tucking in for the night”. All respect to the former, but it’s not for everyone and doesn’t need to be.
Go with solid state. Lighter and it’ll last much longer. https://yoshinopower.com/
Ah, that makes sense. The ol’ Google plus gambit.
This was only active if you signed up for their local guide program iirc.
Not to defend Nvidia entirely, but there used to be real cost savings from die shrinks back in the day, since process node improvements allowed such a substantial increase in transistor density. Improvements in recent years have been smaller, and now they have to use larger and larger dies to increase performance despite the process improvements. This leads to things like the 400 W-class 4090 (despite it being significantly more efficient per watt) and means they get fewer GPUs per silicon wafer, since wafer sizes are industry-standardized for the extremely specialized chip manufacturing equipment. Fewer dies per wafer means higher chip costs by a pretty big factor. That being said, they’re certainly… “Proud of their work”.
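Rough back-of-the-envelope on why die size hurts so much, using the standard gross-dies-per-wafer approximation and purely illustrative die areas (not Nvidia’s actual numbers):

```python
# Standard first-order approximation for gross dies per wafer:
#   DPW ~= pi*(d/2)^2 / A  -  pi*d / sqrt(2*A)
# where d = wafer diameter (mm) and A = die area (mm^2).
import math

def gross_dies_per_wafer(die_area_mm2: float, wafer_diameter_mm: float = 300.0) -> float:
    radius = wafer_diameter_mm / 2
    return (math.pi * radius ** 2 / die_area_mm2
            - math.pi * wafer_diameter_mm / math.sqrt(2 * die_area_mm2))

# Hypothetical "mid-size" vs "huge" dies, purely for illustration.
for area in (300, 600):
    print(f"{area} mm^2 die -> ~{gross_dies_per_wafer(area):.0f} dies per 300 mm wafer")
# Prints roughly 197 and 91: doubling die area more than halves the die count,
# and that's before defect yield, which also punishes big dies harder.
```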
You’re not the asshole, you’re playing with the wrong group. Find people who want to play the same style of game as you; don’t play with people just because they’re who you’re familiar with. That’s a recipe for a bad time.
Depends on the game. Apex, Riot, Ubisoft, and EA all ban VM players, and a number of other companies do as well.
Easy way to get yourself banned in online games, just an FYI. Most online games will detect and ban virtual machines now, since they’ve become commonplace in cheat/hack communities.
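For a sense of how trivial the basic detection is: on Linux, most hypervisors advertise themselves through a CPUID bit that shows up as a “hypervisor” flag in /proc/cpuinfo. This is just a minimal sketch of one signal; real anti-cheat checks far more than this.

```python
# Minimal sketch, Linux-only: check the "hypervisor" CPU flag that most
# virtualization platforms expose to guests via CPUID. Real anti-cheat looks
# at far more than this (timing, drivers, virtual devices, memory layout, ...).

def hypervisor_flag_present() -> bool:
    try:
        with open("/proc/cpuinfo") as f:
            return "hypervisor" in f.read()
    except OSError:
        return False  # /proc not available (e.g. not Linux)

if __name__ == "__main__":
    print("Looks like a VM:", hypervisor_flag_present())
```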
Reddit is dead to me and, given their stance on their APIs, should be dead to pretty much all hobbyists deeply interested in self-hosting.
I’d recommend against it. Apple’s software ecosystem isn’t as friendly for self-hosting anything, storage is difficult to add, RAM is impossible to upgrade, and you’ll be beholden to macOS running things inside containers until the good folks at Asahi or some other community project flesh out Linux support.
And yes, I’ve tried this route. I ran an M1 Mac mini as a home server for a while (running Jellyfin and some other containers). It pretty consistently ran into software bugs (ARM builds are less maintained than their x64 counterparts), and every time I wanted to update, instead of a quick “sudo whatever-your-distro-ships update” and a reboot, it was an entire process involving an Apple account, logging into the bare-metal device, and then finally running their 15-60 minute long update. Perfectly fine and acceptable for home computing, but not exactly a good experience when you’re hosting a service.
Wait… You want us to pay humans? - Every triple A gaming company since 2010.
How I imagine you responding to your singular downvoter:
While I almost completely agree with you, never underestimate the power of using the right tool for the right job. In my experience HDMI is actually far more resilient to signal corruption than DisplayPort, since it implements TMDS and its cables are more commonly well shielded, because they’re expected to be used in device-dense environments. That isn’t really relevant to anyone familiar with technology (don’t bundle your cables next to something with significant RF noise/leaks, duh), but it does matter for the end-user use case these see. The fees HDMI charges are a scam though, for real, and we should demand better from the industry.
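If you’re curious what “transition-minimized” actually means in TMDS, here’s a rough sketch of just the first stage of the 8b/10b encoding; the DC-balancing second stage is omitted, so this is illustrative, not a complete encoder:

```python
# Illustrative sketch of stage 1 of TMDS 8b/10b encoding: each byte is XOR- or
# XNOR-chained to minimize 0->1/1->0 transitions on the wire. The DC-balancing
# second stage is omitted, so this is not a complete encoder.

def transitions(bits):
    """Count level changes in a bit sequence."""
    return sum(a != b for a, b in zip(bits, bits[1:]))

def tmds_stage1(byte):
    """Return the 9-bit transition-minimized word for an 8-bit input."""
    d = [(byte >> i) & 1 for i in range(8)]           # d[0] is the LSB
    use_xnor = sum(d) > 4 or (sum(d) == 4 and d[0] == 0)
    q = [d[0]]
    for i in range(1, 8):
        q.append(1 - (q[i - 1] ^ d[i]) if use_xnor else q[i - 1] ^ d[i])
    q.append(0 if use_xnor else 1)                    # bit 8 records the choice
    return q

raw = [(0b10101010 >> i) & 1 for i in range(8)]       # worst case: alternating bits
enc = tmds_stage1(0b10101010)
print(transitions(raw), "->", transitions(enc[:8]))   # 7 -> 3 transitions
```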
Yeah… That type of brainwashing is so commonplace now though. Just look at how the US is treating striking dock workers: people keep talking about how the workers make xxx,xxx and not how the CEOs make xxx,xxx,xxx,xxx, as if it’s the workers being greedy… 🥲
N…not quite…