I love Stardew and played a bunch when it was first released.
He really had perfect timing with its release. The original developers and the rights holders for Harvest Moon had fumbled so badly for so long, with bad releases or Japan-only releases, etc. Someone was bound to show up in that space since there was clear demand for that type of game. It also helps that he aped (heh) Harvest Moon from the Super Nintendo / Game Boy generation, so it basically runs on a potato and no one needs to buy dedicated hardware.
Definitely a perfect-timing situation, though with substantial risk. Considering how long the game was in development, an alternative could have shown up in the market.
However, I believe Stardew Valley's appeal wasn't simply that it filled a void in the market. It is great because there is genuine passion for the subject in the execution, and the content in the game is truly compelling for a wide audience. An amazing story.
For real, I basically don't even read cover letters anymore, and I don't blame applicants for generating them with LLMs. Unless you are applying for a higher-level position, a cover letter used to be just a mild heuristic that this person took an extra 10 minutes to alter their standard cover letter and include a different, related paragraph. Now it's just wasted text.
Almost everywhere I applied, and these were dozens of positions over many years, I wrote a concise, sometimes funny, sometimes provocative, sometimes insightful cover letter. If I knew something about the company that HR would find interesting, I would write it. If I knew something about the industry or the founders, I'd mention that as well.
My personal experience is that cover letters do not help at all. At best, it's a test for myself. If I don't want to write a cover letter, I should not apply.
They got good jobs because the Allies did not really care about punishing the Nazis.
At the end of WW2, a strong West Germany to oppose the USSR was more important than punishing some middle manager, and the quickest way to get the West German state together was to use a lot of the existing bureaucracy.
They will tell you they are repulsed by it if asked, but it's a toss-up whether they can identify it. Look at any thread on Reddit/IG/TikTok, whatever, and I personally would guess I could manage to identify AI output 20% of the time.
Boomers might be out there consuming those AI YouTube videos that are just a TikTok voiceover with a generated slideshow, but Millennials think that since they can identify this as slop, they are not affected. That is incorrect, and just as bad.
Like anything, it really depends on what they are doing. If you wanted to just open and close a connection, you might run into bottlenecks in other parts of the stack before the CPU tops out, but the real point is that yeah, a single machine is going to be enough.
This is how I feel about this industry's fetishization of "scalability".
A lot of software time is spent making something scalable when, in 2025, I could probably run any site in the bottom 99% of the most-visited sites on the internet on a couple of machines and < $40K of capital.
The sites that think they need huge numbers of small network interactions are probably collecting too much detailed data about user interaction. Like capturing cursor movement. That might be worth doing for 1% of users to find hot spots, but capturing it for all of them is wasteful.
A lot of analytics data is like that. If you captured it for 1% of users, you'd find out what you needed to know at 1% of the cost.
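A minimal sketch of what that 1% sampling could look like client-side (TypeScript; inSample, currentUserId, and sendAnalytics are hypothetical names, not any particular library): hash a stable user ID into a [0, 1) bucket so the same 1% of users stay sampled across sessions, and only those users pay the capture cost.

    // Hypothetical stand-ins: however you get a stable user ID and
    // however you ship events to your analytics backend.
    declare const currentUserId: string;
    declare function sendAnalytics(event: object): void;

    // Deterministically sample ~1% of users: FNV-1a hash the ID into a
    // [0, 1) bucket, so the same users stay in the sample across sessions.
    function inSample(userId: string, rate = 0.01): boolean {
      let h = 0x811c9dc5;
      for (let i = 0; i < userId.length; i++) {
        h ^= userId.charCodeAt(i);
        h = Math.imul(h, 0x01000193) >>> 0;
      }
      return h / 0x100000000 < rate;
    }

    if (inSample(currentUserId)) {
      // Only the sampled 1% pays the network/CPU cost of fine-grained
      // capture; everyone else sends nothing.
      document.addEventListener("mousemove", (e) => {
        sendAnalytics({ x: e.clientX, y: e.clientY, t: Date.now() });
      });
    }

Hashing the ID (rather than rolling a random number per page load) keeps the sample stable, so you get complete sessions from the sampled users instead of fragments from everyone.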
Prior to the recent RAM insanity (a big caveat, I know), a 1U Supermicro machine with 768GB of RAM, some NVMe storage, and twin 32-core EPYC 9004s was ~$12K USD. You can get 3 of those and some redundant 10G network infra (people are literally throwing this stuff out) for < $40K. Then you just have to find a rack/internet connection to put them in, which would be a few hundred a month.
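Rough math on those numbers (the network and odds-and-ends figures are my own guesses, not quotes):

    3 × $12K 1U servers             = $36K
    2 × used 10G switches + optics  ≈ $2K
    rails, cables, spare drives     ≈ $1-2K
    total                           ≈ $39-40K

which is how you land just under the $40K figure.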
The reality is most sites don't need multi-region setups; they have very predictable load, and 3 of those machines would be massive overkill for many. A lot of people like to think they will lose millions per second of downtime, and some sites certainly do, but most won't.
All of this, of course, would be using new stuff. If you wanted to go used, the most cost-effective option is the 5-year-old second-gen Xeon Scalables being dumped by cloud providers. Those are more than enough compute for most; they are just really thirsty, so you will pay with the power bill.
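Back-of-envelope on "thirsty", with my own assumed numbers (a loaded dual second-gen Xeon box at ~500W, power at $0.15/kWh):

    500 W × 8,760 h/yr = 4,380 kWh/yr
    4,380 kWh × $0.15  ≈ $660/yr per box

So three used boxes can run ~$2K/yr in electricity alone, which is a big part of why they're so cheap to buy.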
This, of course, is predicated on the assumption that you have the skill set to support these machines, and that is becoming less common. Though as successful companies that started in the last 10 years do more "hybrid cloud", it is starting to come back around.
They just happen to have an online config tool that is somewhat close to what you would pay if you didn't engage with sales, which is useful for a hackernews comment.
It pays to be the middle man!