The very next ask will be "order the zipcodes by number of customers", at which point you'll be back to aggregations, which is where you should have started.
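A minimal sketch of that progression, assuming a hypothetical customers table with a zipcode column:

    -- the DISTINCT version only answers "which zipcodes have customers?"
    SELECT DISTINCT zipcode FROM customers;

    -- the follow-up ask forces the aggregation you could have started with
    SELECT zipcode, COUNT(*) AS customer_count
    FROM customers
    GROUP BY zipcode
    ORDER BY customer_count DESC;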
Anti-Patterns You Should Avoid: overengineering for potential future requirements. Are there real-life cases where you should design with the future in mind? Yes. Are there real-life cases where DISTINCT is the best choice by whatever metric you prioritize at the time? Also yes.
I hear you. It's not all _that_ uncommon for me to query for "things with more than one instance". Although, to be fair, it's more common for me to do that when grep/sort/uniq-ing logs on the command line.
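For what it's worth, the SQL shape of that "more than one instance" query is the same aggregation, just with a HAVING clause (again assuming a hypothetical customers table):

    -- keep only the keys that appear more than once
    SELECT zipcode, COUNT(*) AS customer_count
    FROM customers
    GROUP BY zipcode
    HAVING COUNT(*) > 1;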
Here we start to get close to analytics SQL vs. application SQL, and I think that's a whole separate beast in itself, with different patterns and anti-patterns.
"Figure out which codes they can use to get the most revenue" is a billion dollar industry with many players, subspecialties and surprisingly few lawsuits.
A lack of lawsuits can just mean an off-the-record agreement that no one benefits from the entire mess being dragged in front of the courts and into the public record, because that is how you give future Luigis ideas.
The shadier the industry, the more everyone involved shies away from sunlight.
Long wait times are not exclusive to public healthcare. My dermatology appointment to examine a concerning mole was scheduled 9 months out. And of course I pay for the privilege.
I implemented a casino in assembly for college. Started with a Mersenne Twister and added a few pure chance games like roulette and slots.
The PRNG was trivial. Managing the user's bank balance, switching between games, and making the roulette wheel 'spin' by printing a loop of numbers slower and slower was painstaking, error-laden work.
State doesn't just contribute to complexity; it is complexity.
The key is that the goodness/badness of advice is a function of the receiver. The internet doesn't give you control over who reads your stuff, so internet advice is safer and less useful than it could be.
The advice "use 'any' if it's too much work to type" is dangerous/bad advice for some developers because they don't have a well tuned definition of 'too much work', and they might not have all the tricks in the toolbox for every situation.
But legacy code or poorly typed libs can be an infinite time suck, and the most pragmatic approach might be to cut your losses, slap an 'any' on it, and move on.
A great mentor gives the best (a different axis than good/bad or safe/dangerous) advice for an individual in a specific situation.
In healthcare, notes are directly correlated with $$$ for the hospital, because everything that is billed for must be documented with a mix of metrics (O2%, temp, lab results), events (orders, prescriptions, procedures) and notes (consultation notes, imaging interpretations, discharge summaries).
Billions get spent annually on administrative overhead focused on squeezing as much money out of these notes as possible. A tremendous expense can be justified to increase note quality (aka revenue, though 'accuracy/efficiency' is the Trojan horse used to slip by regulators).
GenAI has a ton of potential there. Likewise on the insurance side, which has to wade through these notes and produce a very labor-intensive paper trail of its own.
Eventually the AIs will just sling em-dashes at each other while we sit by the pool.