> Let’s rip the Band-Aid off immediately: If your underlying business process is a mess, sprinkling "AI dust" on it won’t turn it into gold. It will just speed up the rate at which you generate garbage.
In the world of Business IT, we get seduced by the shiny new toy. Right now, that toy is Artificial Intelligence. Boardrooms are buzzing with buzzwords like LLMs, agentic workflows, and generative reasoning. Executives are frantically asking, "What is our AI strategy?"
But here is the hard truth:
There is no such thing as an AI strategy.
There is only Business Process Optimization (BPO).
This is well-expressed, and almost certainly true for an overwhelming majority of companies.
A similar observation commonly comes up in software development: "it's not tech debt, it's org debt" (or, to put it a different way, "trying to use a technical solution to solve a social problem").
I hear that one a lot but pretty frequently it's applied to "social problems" which were caused by technology. It seems to imply some kind of technology/society boundary which doesn't actually exist.
The saying "you can't solve social problems with technology" usually means - at least in the places I have heard/used it - "If your workforce fights a process - be it because the process is stupid, the tools are slow, incentives don't align with policy, whatever - and especially a control step, no amount of mandatory tech enforcement of that process step will yield better results." At best you get garbled data because someone hit the keyboard just to fill in the mandatory fields; sometimes the process now works OUTSIDE of the system through informal methods because 'work still needs to be done'; at worst, you get a mutiny.
You have to fix the people(s' problems) by actually talking to them and taking the pain points away; you do not go to 'computer says no' territory first.
In my experience, no org problem is only social, and no tech problem is merely technical. Finding a sustainable solution in both fields is what distinguishes a staff engineer from a junior consultant.
I work on a SaaS platform as an engineer. We'll have people from customer A asking for a bunch of fields to be mandatory, only to have people from that same company nagging about those fields six months later and saying our platform sucks. Well, no, their process and requirements suck; we didn't come up with which fields are mandatory.
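To make that concrete, here is a minimal, purely illustrative sketch of the pattern: "mandatory" is per-customer configuration that the customer itself requested, and the platform only enforces what's in that configuration. Names like REQUIRED_FIELDS_BY_TENANT and validate_record are hypothetical, not actual platform code.

```python
# Hypothetical per-tenant configuration: each customer decides which
# fields it wants enforced; the platform does not invent these lists.
REQUIRED_FIELDS_BY_TENANT = {
    "customer_a": ["cost_center", "approver", "project_code"],  # requested by customer A
    "customer_b": ["cost_center"],
}

def validate_record(tenant: str, record: dict) -> list[str]:
    """Return the mandatory fields that are missing or empty for this tenant."""
    required = REQUIRED_FIELDS_BY_TENANT.get(tenant, [])
    return [field for field in required if not record.get(field)]

# Usage: the platform only enforces what the tenant asked for.
missing = validate_record("customer_a", {"cost_center": "CC-42", "approver": ""})
print(missing)  # ['approver', 'project_code']
```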
I've been thinking a lot about that lately, and I agree. I used to be firmly in the "you can't solve social problems with technical solutions" camp, but that's not the whole truth.
If people aren't using your thing, sure, you can brand that as a social problem (lack of buy-in on the process, people not being heard during rollout, ...). However, one way of getting people to use your thing/process is to make it easier to use. Integrate it well into the workflow they're already familiar with, bring the tooling close, reduce friction, provide some extra value to your users with features, etc.
Those are technical solutions, but if you choose them based on knowledge of the "social problem", they can be quite effective.
Oh, now I have a name for the epidemic pervading our company.
Almost all of the tech debt we have was introduced by leadership guidance to ignore it. And all the additional debt incurred to manage or ameliorate it (since problems don't just go away) also comes from leadership guidance to fast-track fixes.
What happened to the days where software engineers were the experts who decided tech priority?
That is *IF* there ever is acknowledgment that things need to be fixed.
"These pesky issues, who knows where they come from, just quickly get it out of the way. We have the next shiny new to tackle for the next quarter, and we better finish it quickly".
> What happened to the days where software engineers were the experts who decided tech priority?
Outside of a very small number of firms that were called out as notable for being led in a way that enabled that, often by engineers who were themselves still hands-on, those days never existed. Even there, it was "business leadership that happened to also be engineers, and made decisions based on business priorities informed by their understanding of software engineering," not "software engineers in their walled-off citadels of pure engineering." And in successful firms it usually involved considerable willingness to accept tech debt, just as business leadership is often not shy about accepting financial debt.
> business leadership is often not shy about accepting financial debt
Business leadership is not shy about accepting financial debt when business leadership has decided it should accept financial debt. Technical leadership should ostensibly not be shy about accepting technical debt because business leadership has decided it should accept technical debt. The distribution of agency and responsibility in the two situations is different.
Pedantically, true "tech debt" does exist. Usually it resulted from "org debt" that has since mostly ceased to exist in its original form, but the tech debt remains.
I worked for a Fortune 300 company that engaged in a Business Process Redesign initiative. After spending 90 million on the project, they pulled the plug.
My takeaway was that the project was doomed because it was named wrong. Should have been called Business Process Design.
They are now owned by Private Equity. I can only wonder what madness they would have wrought with AI.
They tried to implement a system whereby a customer has a single customer number. Between mergers, acquisitions, and shutdowns, it was impossible to keep straight and to keep the history tracked. It impacted rates, contracts, sales commissions, division revenue, everything. In the end they gave everyone a new number while still using the old ones.
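As a rough sketch of the outcome described above (giving everyone a new master number while still honoring the old ones), one common shape is a cross-reference table from legacy numbers to the new master record. Table and column names here are made up for illustration, not taken from the actual system.

```python
import sqlite3

# In-memory example: a master customer record plus a cross-reference table
# that maps every legacy number (from mergers, acquisitions, shutdowns)
# to the single new master id, so old contracts and history stay traceable.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE customer_master (
    master_id INTEGER PRIMARY KEY,
    name      TEXT NOT NULL
);
CREATE TABLE customer_xref (
    legacy_number TEXT PRIMARY KEY,   -- old number issued by an acquired/merged entity
    source_system TEXT NOT NULL,      -- which legacy system issued it
    master_id     INTEGER NOT NULL REFERENCES customer_master(master_id)
);
""")

conn.execute("INSERT INTO customer_master VALUES (1, 'Acme Industrial')")
conn.executemany(
    "INSERT INTO customer_xref VALUES (?, ?, ?)",
    [("A-1001", "legacy_erp_north", 1), ("CUST-77", "acquired_co_crm", 1)],
)

# Resolve any legacy number to the single master record.
row = conn.execute("""
    SELECT m.master_id, m.name
    FROM customer_xref x JOIN customer_master m USING (master_id)
    WHERE x.legacy_number = ?
""", ("CUST-77",)).fetchone()
print(row)  # (1, 'Acme Industrial')
```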
Almost every problem a modern corpo has can be solved with an appropriate head-count of appropriately trained/educated people, and that's why none of them get solved.
The processes suck because of decades of corner cutting and "fat" trimming while the executives congratulate themselves for only making the product a biiiit worse in exchange for a 0.0005% cost reduction, before then offsetting any gains by giving themselves all the money that would've gone to whatever is now dead.
Repeat this process for 30 years and you have companies like Microsoft that can barely ship anything that works anymore, and our 4 Big Websites frequently just fail to load pages for no explicable reason, Amazon goes down and takes 1/3 of the internet with it, and AI companies are now going to devour the carcass of our internet and shit it back to us in LLM waffle while charging us money for the privilege to eat it.
It's the idea from The Mythical Man-Month. Programming software is a different thing than working on an assembly line, in a call center, or in retail sales. You're much better off having four programmers who are worth paying $200k a year than ten programmers who are worth paying $75k a year.
I'm going to argue that, at scale, process beats the quality of the people you're using -- and also that there are toxic cultures, around Google and C++, where very smart people get seduced into spending all their time and effort fighting complexity, battling 45 minute builds, etc.
> and also that there are toxic cultures, around Google and C++, where very smart people get seduced into spending all their time and effort fighting complexity, battling 45 minute builds, etc.
Not sure what you mean here. "Fighting" as in "seeking to prevent", or "putting up with", or what exactly? Is this supposed to be bad because it's exploitative, or because it's a poor use of the smart person's time, or what exactly?
Essentially that the idea that people can hold 7 ± 2 things in their head simultaneously is basically true, such that when your tools make a demand on your attention, it subtracts from the attention you can put on other things.
There are many sorts of struggle. There is the struggle of managing essential complexity, and also the struggle, especially in the pre-product phase, of getting consensus over what is "essential" [1]. When it comes to accidental complexity, you can just struggle following the process, or struggle to struggle less in the future through some combination of technical and social innovations, which themselves can backfire into increased complexity.
Google can afford to use management techniques that would be impossible elsewhere because of the scale and profitability of their operations. Many a young person goes there thinking they'll learn something transferable but the market monopolies are the one thing that they can't walk out with.
[1] Ashby's law (https://www.edge.org/response-detail/27150), best exemplified by the Wright Flyer, which could fly without tumbling because it controlled roll, pitch, and yaw.
Honestly, I don't know if throwing people at a problem is the way to go. Doubly so given that a good chunk of my projects lately deal with third-party vendors, and those are so... embedded that even getting basic requirements or documentation is an uphill battle (which, to me, seems insane). I have zero pull, so I do what I can, note the insanity for CYA, and move on.
I do agree on execs congratulating themselves afterwards though. It was obscene last year. This year it was mildly muted.
Throwing people at a problem is very different from allocating an appropriate head-count of appropriately trained/educated people. A small but skilled team can accomplish a lot, whereas a lot of the wrong people can't do much at all. Generally there are more than enough warm bodies available, big companies are full of those; the issue is that skilled people aren't fungible. The team of 12 working on this project seems to be moving at a snail's pace because really it's two people, both of whom are split across several other projects simultaneously, doing the real work, and everyone else doing stuff that is likely unnecessary if not straight-up counterproductive. It takes skill, effort, and discipline to cultivate a team that actually has all the skills it needs to succeed, in the form of people who mutually work well together, to keep those people around over an extended period, and not to split them up onto different projects and plug the gaps with the wrong people.
Sorry, but the multiple colleagues I've lost through multiple rounds of layoffs were not simply "people thrown at a problem". They were helping keep the lights on, which is now my responsibility for no increase in pay.
> Almost every problem a modern corpo has can be solved with an appropriate head-count of appropriately trained/educated people
Not really, because solving those problems with headcount defeats the point. Part of the definition of those kinds of problems is that solutions involving headcount are invalid.
Yep, in fact, I did use LLMs to formulate the text (I used several, until I got the result that I wanted) and tweaked some parts of it manually. Here are the prompts that I used:
"it's always the process, stupid!
In the context of business IT, it is always and only about BPO, nothing else. so if you want to be successful implementing AI in the enterprise context, you have to handle it like every other software tool: look at your processes and find out how AI, agentic or not, can help optimize the process or help build a better process. Like every technology, AI dont make more intelligent, it only makes faster
write me a blog post to express this ideas"
And I added this in a second step:
"i need to integrate another idea:
AI is very good at handling unstructured data, in fact, it's the first tool that is very useful for that. But most processes that uses unstructured data are not documented, as they are also often unstructured. So to implement AI in processes, we must improve our process design"
Fact: when you know what you want to achieve, AI can be very useful, especially for lazy people like me ;)
There was a guy who wrote a blog post in that style who was wondering how it was he'd posted hundreds of messages to people on LinkedIn and gotten no replies.
There are some people who insist on spamming out splog posts in that style. Some of them think they are blogging, not splogging, and maybe they have good intentions, but that style screams "SPAM!", and unfortunately the people writing it don't understand how it comes across.
I feel like this is an inside view from the BPO community and the only part of AI they see is the part that affects BPO. But for most businesses AI strategy is not about AI for internal use but AI to either improve customer funnels or launch new products. Most of the companies I've talked to in the past year wanted a strategy for customer facing AI not internal AI.
The last time a company put an AI chat bot between me and actual customer service it didn’t listen to my problem and hallucinated something I didn’t say.
It can both be Business Process Optimization and an AI strategy.
In fact, if an AI strategy becomes business process optimization, I'd say that AI strategy for that company is successful.
There are too many AI strategies today that aren't even business process optimization, are detached from the bottom line, and are just pure FOMO from the C-suite. Those probably won't end well.