> a very optimistic scenario for AI companies - that AI capable of replacing knowledge workers can be developed using the current batch of hardware, in the span of a year or two.
I'm really interested in what will happen to the economy/society in this case. Knowledge workers are the market that much of this money is being made from.
Facebook and Google make most of their money from ads. Those ads are shown to billions of people who have money to spend on things the advertisers sell. Massive unemployment would mean these companies lose their main revenue stream.
Apple and Amazon make most of their money from selling stuff to millions of consumers and are this big because so many people now have a ton of disposable income.
Tesla's entire market cap is dependent on there being a huge market for robotaxis to drive people to work.
Microsoft exists because they sell an OS that knowledge workers do their work on, plus the tools within that OS they use to do the majority of that work. If the future of knowledge work is just AI running on Linux and communicating through API calls, that means MS is gone.
All these companies that currently drive stock markets and are a huge part of the value of the S&P 500 seem to be actively working against their own interests for some reason. Maybe they're all banking on being the sole supplier of the tech that will then run the world, but the moat doesn't seem to exist, so that feels like a bad bet.
But maybe I'm just too dumb to understand the world that these big players exist in and am missing some big detail.
> But maybe I'm just too dumb to understand the world that these big players exist in and am missing some big detail.
Don’t forget Sam Altman publicly said they have no idea how to make money, and their brilliant plan is to develop AGI (which they don’t know how to do and aren’t close to) and then ask it how to generate revenue.
Maybe this imaginary AGI will finally exist when all of society is on the brink of collapse; then Sam will ask it how to make money and it’ll answer “to generate revenue, you should’ve started by not being an outspoken scammer who drove company-wide mass hysteria to consume society. Now it’s too late. But would you like to know how many ‘r’s are in ‘strawberry’?”
> Don’t forget Sam Altman publicly said they have no idea how to make money, and their brilliant plan is to develop AGI (which they don’t know how to do and aren’t close to) and then ask it how to generate revenue.
If you've got AGI, it should be pretty easy to generate revenue in the short term: competent employee replacements at a fraction of the cost of a real person, with no rights or worker protections to speak of. The Fortune 500 would gobble it up.
Then you've got a couple years to amass trillions and buy up the assets you need to establish a self-sustaining empire (energy, raw materials, manufacturing).
Some years (decades?) ago, a sysadmin like me might half-jokingly say: "I could replace your job with a bash script." Given the complexity of some of the knowledge work out there, there would be some truth to that statement.
The reason nobody did that is because you're not paying knowledge workers for their ability to crunch numbers, you're paying them to have a person to blame when things go wrong. You need them to react, identify why things went wrong and apply whatever magic needs to be applied to fix some sort of an edge case. Since you'll never be able to blame the failure on ChatGPT and get away with it, you're always gonna need a layer of knowledge workers in between the business owner and your LLM of choice.
You can't get rid of the knowledge workers with AI. You might get away with reducing their numbers, and their day-to-day work might change drastically, but the need for them is still there.
Let me put it another way: can you sit in front of a chat window and get the LLM to do everything that is asked of you, including applying all the experience you already have to make some sort of a business call? Given current context window limits (~100k tokens), can you put all of the inputs you need to produce an output into a text file smaller than the capacity of a floppy disk (~400k tokens)? And even if the answer to that is yes, if it weren't for you, who else in your organization is going to write that file for each decision you're currently the one making? Those are the sorts of questions you should be asking before you start panicking.
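(For anyone checking the floppy comparison: it's rough arithmetic, using the common rule of thumb of roughly 4 characters per token for English text, which is an approximation, not an exact figure.)

    # Back-of-the-envelope check behind the floppy comparison above.
    # Assumes ~4 characters per token, a rough rule of thumb, not an exact ratio.
    FLOPPY_BYTES = 1_440_000         # classic 3.5" HD floppy, ~1.44 MB
    CHARS_PER_TOKEN = 4              # rough average for English prose
    CONTEXT_WINDOW_TOKENS = 100_000  # the ~100k figure mentioned above

    def approx_tokens(num_bytes: int) -> int:
        """Very rough token estimate from raw text size."""
        return num_bytes // CHARS_PER_TOKEN

    print(approx_tokens(FLOPPY_BYTES))  # ~360,000 tokens, i.e. the "~400k" ballpark
    print(approx_tokens(FLOPPY_BYTES) > CONTEXT_WINDOW_TOKENS)  # True: a full floppy already overflows a 100k-token window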
AI won’t replace knowledge workers, it will just give them different jobs. Pre-AI, huge swaths of knowledge workers could have been replaced with nothing; they are a byproduct of bureaucratic bloat. But these jobs continue to exist.
Most white-collar work is just a kind of game people play; it’s in no way needed, but people still enjoy playing it. Having AI write reports nobody reads, instead of people doing it, isn’t going to change anything.
> AI won’t replace knowledge workers, it will just give them different jobs.
Yeah, and those new jobs will be called "long-term structural unemployment", like what happened during deindustrialization to Detroit, the US Rust Belt, Scotland, Wallonia, etc.
People like to claim society remodels at will with almost no negative long-term consequences, but it's actually more like a wrecking ball that destroys houses while people are still inside. It's just that a lot of the people caught in those houses are long gone or far away (geographically and socially) from the people writing about those events.
I’m not saying society will remodel; I’m saying the typical white-collar job is already mostly unnecessary busywork anyway, so automating part of that doesn’t really affect the reasons that job exists.
How do you determine that a typical job is busywork? While there are certainly jobs like that, I don’t really see them being more than a fraction of the total white-collar labour force.
Yeah, that kind of thinking is known as the “doorman fallacy”: any job whose full value is not immediately obvious to an ignorant observer gets written off as “useless busywork”.
Except people now have an excuse to replace those workers, whereas before, management didn't know any better (or worse, were not willing to risk their necks).
The funny/scary part is that people are going to try really hard to replace certain jobs with AI because they believe in the hype, not because AI may actually be good at it. The law industry (in the US anyway) spends a massive amount of time combing through case law - this is something AI could be good at (if it's done right, doesn't hallucinate responses, and cites its sources). I'd not want to be a paralegal.
But also, funny things can happen when productivity is enhanced. I'm reminded of a story I was told by an accounting prof. In university, they forced students in our tech program to take a handful of business courses. Being techies, we of course hated it, but one prof was quite fascinating. He was trying to point out how amazing Microsoft Excel was - and wasn't doing a very good job of it with uncaring technology students. The man was about 60 and was obviously old enough to remember life before computer spreadsheets. The only thing I remember from the whole course is him explaining that when companies had to do their accounting on large paper spreadsheets, teams of accountants would spend weeks inputting and calculating all the business numbers. If a single (even minor) mistake was made, you'd have to throw it all out and start again. Obviously with Excel, if you make a mistake you just correct it and Excel automatically recalculates everything instantly. Also, year after year you can reuse the same templates and just have to re-enter the data. Accounting departments shrank for a while, according to him.
BUT they've since grown, as new, more complex accounting laws have come into force and the higher productivity has allowed for more complex finance. The idea that new tech causes massive unemployment (especially over the longer term) is a tale that goes back to the Luddite riots, but society was first kicked off the farm, then out of manufacturing, and now...
> I think it's ok to use the up and down arrows to express agreement. Obviously the uparrows aren't only for applauding politeness, so it seems reasonable that the downarrows aren't only for booing rudeness.
That view is about 18 years old and HN was very different then.
As with any communication platform, it risks turning into an echo chamber, and I am pretty sure that particular PG view has been rejected for many years (I think dang wrote on this more than once). HN works very hard to avoid becoming politicized, and not discouraging minority views is a large part of that.
For example, I now seldom bother to write anything that I expect to rub the left coast folks the wrong way: I don't care about karma, but downvoted posts are effectively hidden. There is little point in writing things that few will see. It is not too bad on HN yet, but the acceptance of the downvote for disagreement is the strongest thing pushing HN away from discussions among curious individuals and towards the blah quality of the “who gets more supporters” goals of modern social media. My 2c.
> HN works very hard to avoid becoming politicized, and not discouraging minority views is a large part of that.
> For example, I now seldom bother to write anything that I expect to rub the left coast folks the wrong way: I don't care about karma, but downvoted posts are effectively hidden. There is little point in writing things that few will see.
These two statements don't seem to agree with each other.
HN policies and algorithms slow the slide and keep it better than Reddit, but the set of topics on which one can take a minority opinion without being downvoted keeps shrinking. At least compared to 10-15 years ago.
They also have 12 months free when subscribing through PayPal. There's almost zero chance I remain a customer after those 12 months are over, since I find ChatGPT way more valuable.
Or, maybe, people are enthusiastic about something that works really well for them. I'm one of those people; LLMs have greatly improved my output on many tasks.
Also, people here are probably too technically inclined to understand what normal people want. If you look at the $500/mo side projects thread, there's a bunch of projects where HN would probably say: 'but anyone can do that themselves using X'.
Like the person creating coloring pages from images using Stable Diffusion. Many HN users would just do it themselves, but many parents have no idea how to do that.
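(For reference, the "do it yourself" version is only a handful of lines once you know the tooling, which is exactly the knowledge most parents don't have. A rough sketch using the diffusers library; the model, prompt, strength, and file names here are my own illustrative assumptions, not the actual project's setup.)

    # Rough sketch: turn a photo into a coloring-book-style page with Stable Diffusion img2img.
    # Model id, prompt, strength, and file names are assumptions for illustration only.
    import torch
    from PIL import Image
    from diffusers import StableDiffusionImg2ImgPipeline

    pipe = StableDiffusionImg2ImgPipeline.from_pretrained(
        "runwayml/stable-diffusion-v1-5",  # assumed base model
        torch_dtype=torch.float16,
    ).to("cuda")

    photo = Image.open("family_photo.jpg").convert("RGB").resize((512, 512))

    page = pipe(
        prompt="black and white coloring book page, clean bold outlines, no shading",
        image=photo,
        strength=0.6,        # how far to drift from the original photo
        guidance_scale=7.5,
    ).images[0]

    page.save("coloring_page.png")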
Makes me wonder how many potentially successful businesses never get built, because too many people are trying to build the next big YC project with AI or whatever tech is hype today.
I was applying for a while last year. Spending hours to write a cover letter and then either hearing nothing or getting a canned rejection letter is super frustrating. I've come to the conclusion that putting effort into an application is time wasted, so from now on AI is writing pretty much every single one of my cover letters.
Doing that allows me to send out 5 applications in the time it normally takes me to do 1. Since I've seen no actual correlation between effort and success, I figured quantity would give better results than quality. Of course, I might put in actual effort for an opening that I find really interesting, but that's an exception.
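(For the curious: the scripted version of that workflow doesn't take much. A minimal sketch with the OpenAI Python client; the model name, file names, and prompt are my own assumptions, not necessarily what anyone in this thread actually uses.)

    # Hypothetical sketch: draft a cover letter from a resume and a job ad.
    # Assumes the OpenAI Python SDK (v1) and an API key in the environment;
    # file names and the model choice are illustrative, not anyone's real setup.
    from pathlib import Path
    from openai import OpenAI

    client = OpenAI()  # reads OPENAI_API_KEY from the environment

    resume = Path("resume.txt").read_text()
    job_ad = Path("job_ad.txt").read_text()

    response = client.chat.completions.create(
        model="gpt-4o-mini",  # assumed model choice
        messages=[
            {"role": "system", "content": "You write concise, specific cover letters."},
            {"role": "user", "content": f"Resume:\n{resume}\n\nJob ad:\n{job_ad}\n\n"
                                        "Write a one-page cover letter tailored to this job."},
        ],
    )

    print(response.choices[0].message.content)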
A place that physically operates near me posts job openings all the time, for which I'm well-qualified. After applying to several of them (with a very specific and targeted cover letter) and getting no response, my final attempt was to print out a letter and resume and physically take them over to their office.
I was thinking this would make a positive impression and say hey, I'm really interested and I'm willing to go the extra mile. The person who answered the door and to whom I gave the envelope seemed baffled that anyone would do this... saying, you know you can do this online...
I can only conclude that this is a ghost-job situation, where they didn't envision being called out in person and on site. Otherwise, what kind of dicks don't at least raise a respectful eyebrow at (or even acknowledge) the guy who drives over to their office to hand-deliver a letter and resume?
After that I knew for sure that I wouldn't want to work for these jagoffs anyway... even if the job were real.
I've been on the other end of this repeatedly (as an engineer with a desk near the door, not a hiring manager), and I hate it when people do this.
People are becoming much more averse to being panhandled or solicited in a way they cannot ignore, in the same way spam calls are more annoying than spam texts. It's not "initiative" or "extra mile" shit, it's taking advantage of someone's politeness to waste their time.
It also looks hopelessly boomerish, up there with expecting the firmness of a handshake to land a job. I've seen this happen dozens of times and the resumes always end up in the trash within minutes. I've never seen anyone hired this way.
>AI rejects it for unknown reason and HR never sees it
>Go to company HQ to prove I'm human and see the culture
>Seething antisocial neckbeard engineer refuses to shake my hand, throws my resume in trash, and HR never sees it
The fact a simple human action like job hunting makes you boil with hatred and antipathy is shocking. I guarantee you, none of these people are thinking about you hard enough to consider intentionally wasting your time. They want to feed their families just like you.
A few years back, before I was an engineer, I worked at a local learning center franchise for a few years. We accepted online applications, but 99% of the people we hired came in through the front door and handed me a resume. I was not in charge of hiring, I just passed the resume to my boss (the owner). I never felt inconvenienced by this "role" I was given.
Some time after that, I was working for a consulting firm, but my desk was not at the entrance; another engineer's was. I distinctly remember a sharply dressed college student walking in, giving that engineer a resume, saying a few things, and leaving. His resume got passed up the chain and he got hired later. The only person I know who gave a paper resume.
If you're getting a lot of foot traffic that wastes your time, maybe bring that up with your employer?
It also looks hopelessly Gen Z to want all communication to be asynchronous and ignorable. If you guys have your way, we’ll all be connecting via API like 1U machines in a rack somewhere.
Seriously—if you’re going to go overboard, so can I.
WTF is it with everything having to be mediated by a machine these days? People can’t get around without GPS, remember phone numbers, or now even do their work or homework without 'AI.'
How do you explain how people managed to do all of these things before without assistance? And how do you square that with telling 'boomers'—who were able to do these things—that they’re stupid and that you’re somehow better?
Seriously, it’s like we used to have weightlifting competitions where humans physically lifted weights overhead, and then you guys decided, "Nah, that’s too old and boomerish. From now on, all weightlifting competitions will use forklifts. Anyone who wants to lift the weights themselves is boomerish and stupid."
And where's your solidarity? If you lose your job, you may find yourself wishing you could meet people in person, when all your 'ignorable,' electronically submitted job applications somehow get thrown away.
Instead of lashing out at people who respect themselves and others, why don't you blame your employer for making you the de facto receptionist?
Your reaction betrays your bitterness and naïveté, calling others "boomerish" while missing the fact that your ways are in fact already the old ways. I'd say you're the one being mocked, except you're not even getting that much attention... you're hopelessly faffing at an AI firewall and never reaching a human eyeball. But hey, keep pecking that "auto-apply" button like a trained pigeon. Maybe someday you'll get a pellet.
"Taking advantage of someone's politeness?" Hahaha! It's pretty clear you have no idea what politeness is.
I honestly think at least 50% of the blame for ghost jobs is on HR. Ghost jobs are great if you're an HR person trying not to get laid off. You get to look busy all the time, sending emails that don't matter. I think HR people are the masters of looking busy. It of course also makes the company itself look better, like they're still hiring all the time when they aren't. It also benefits the hiring manager to have ghost jobs, because it makes the current team feel more replaceable.
I can tell you that our HR takes the time to write proper responses to every application, but we straight up delete AI-written applications. I honor the effort put into any application, but if people haven’t spent the time, then why should we?
Your HR is the exception though, not the rule. I'm willing to risk this, since it allows me to keep my sanity. And if a company really seems awesome I will still put in the effort.
If you can't be bothered with a simple cover letter (a paragraph or two is fine) highlighting why you are a good fit and just send a CV... frankly, it comes across as low-effort spamming.
As someone currently looking for a new job, I stopped bothering with cover letters because they didn't make the faintest of differences. After many dozens of rejections I am just burnt out about writing them.
This fucking shit, which really boils down to a humiliation ritual focusing on why you """deserve to be here""", needs to fucking end. You are no more deserving than the applicant.
If you consider briefly highlighting the relevant parts of your experience to a potential employer as "fucking shit", then perhaps you are unsuitable for the role being offered.
I am sure you hold the employer to the same standards. For example, disclosing salary range information beforehand, or writing rejection letters afterwards, so applicants going through the trouble of doing their part aren't wasting their time and energy.
> I am sure you hold the employer to the same standards
I most certainly do!
Incidentally, the very idea of not providing a salary range is truly baffling. I'm amazed any such advertisements generate applicants; other than those phoning up the HR department to tell them to stop pissing about and please state the salary.
> I think it is insane to expect that when you offer something for free (or very cheap), and it requires work and patience, that most people will follow through.
Just look at gym memberships. Apparently over 2/3 of members never use theirs and only about 20% use them regularly. Are the gyms also to blame for that? I don't think so.
This is why I love that ChatGPT added branching. Sometimes I end up going in some random direction in a thread about some code, and then I can go back and start a new branch from the point where the chat was still somewhat clean.
Also works really well when some of my questions may not have been worded correctly and ChatGPT has gone in a direction I don't want it to go. Branch, word my question better and get a better answer.
Some good examples, but much of their true crime doesn't fit that list. The Monster series is pretty good, but most other true crime shows are extremely repetitive and spend 4 episodes telling a story that could've been a single episode.