Innovation shouldn't exist for its own sake. Innovation should serve a purpose. And the purpose of programming languages is to overcome the limitations of the human mind: our attention span, our ability to manipulate concepts expressed in abstractions and syntax. We don't know how much longer we'll need this.
I really like Zig, and I wish it had appeared several years earlier. But rewriting everything in Zig may soon stop making practical sense.
I agree that programming languages will no longer need to be as accessible to humans.
However, there is still a strong argument to be made for the protections/safety that languages can provide.
E.g., would you expect a model (assuming it had the same expertise in each language) to make more mistakes in ASM, C, Zig, or Rust?
I imagine most would agree that ASM/C would likely have the most mistakes, simply because fewer constraints are enforced the closer you get to the metal.
So, while we might not care about how easy it is for a human to read/write, there will still be a purpose for innovation in programming languages. But those innovations, IMO, will be more focused on how to make languages easier for AI.
> would you expect a model (assuming it had the same expertise in each language) to make more mistakes in ASM, C, Zig, or Rust?
"assuming it had the same expertise in each language" is the most important part here, because the expertise of AI with these languages is very different. And, honestly, I bet on C here because its code base is the largest, the language itself is the easiest to reason about and we have a lot of excellent tooling that helps mitigate where it falls short.
> I imagine most would agree that ASM/C would be likely to have the most mistakes simply because fewer constraints are enforced as you go closer to the metal.
We need these constraints because we can't reliably track all the necessary details. But AI might be much more capable (read: scalable) at that, so all the complexity we have to accumulate in a programming language might simply be something it knows by virtue of how it's built.
I’m going to assume you’re open to an honest discussion here.
> "assuming it had the same expertise in each language" is the most important part here, because the expertise of AI with these languages is very different.
You are correct, but I am trying to illustrate that, assuming some ideal system with equal expertise in each, the languages with more safety would win out over those with less safety in terms of productivity and bug counts.
That is to say, it could be worth investing further in safer programming languages because AI would benefit.
> We need these constraints because we can't reliably track all the necessary details.
AI cannot reliably track the details either (yet, though I am sure it can be done). Even if it could, it would be a complete waste of resources (tokens).
Why have an AI determine the type of a variable when it could be done in a deterministic manner with a compiler or linter?
To me, these arguments closely mirror the static-versus-dynamic typing arguments for human programmers. Static type systems eliminate certain kinds of errors and can produce higher-quality programs. AI systems will benefit in the same way, if not more, by getting instant feedback on the validity of their program.
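To make that concrete, here's a minimal sketch in Python (the function and its arguments are made up for illustration): a static checker such as mypy rejects the bad call deterministically, before anything runs, and without spending a single model token on it.

    def apply_discount(price: float, percent: float) -> float:
        """Return the price after subtracting a percentage discount."""
        return price * (1 - percent / 100)

    total = apply_discount(100.0, 15.0)   # fine: both arguments are floats

    # apply_discount("100", "15%")        # flagged by a static checker before runtime:
    #                                     # the arguments have type "str", "float" expected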
Yes, I get your point and I think your arguments are valid; it's just not the whole story.
The thing about programming languages is that, for both their creators and their advocates, a significant part of the driving motivation is emotional, not rational necessity alone. Learning a new programming language along with its ecosystem is an investment of time and effort; it is something our brains mark as important and therefore protected (I'm looking at Rust). Once AI is writing all the code, that emotional part might eventually dissolve and move to something else, leaving the choice of programming language much less relevant. Like the list of choices Claude Code shows you in planning mode: "do you wish to use SQLite, PostgreSQL or MySQL as a database for your project?" (*picking the "Recommended" option)
That said, I hope Zig makes it to version 1.0 before AI turns the tables and sweeps many things away. It might be my bias; if I'm wrong and overestimating the irrational part, I'll be glad to admit my mistake.
I call it "the ability to communicate intent [using a programming language]" and suddenly building with AI looks at lot more like the natural extension of what we used to do writing code by ourselves.
At first I thought it was about turning off settings that allow me to watch garbage TV shows (or, in this case, garbage ending seasons of initially decent TV shows)
MCP is already deterministic. What's huge about it is that it has automatic API discovery and integration built in. It's still a bit rough, but I think we'll only see it get improved more and more.
You really want to use WSDL? OpenAPI v3 would be a much better fit. But it has a ton of features that are completely unnecessary for this use case. What if we just stripped it down to input and output JSON Schemas? Oh wait ... we just invented MCP.
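To illustrate what I mean, here's a rough sketch in Python (the concrete tool and its fields are invented for illustration, following the common name/description/inputSchema shape): an MCP tool boils down to a name, a description, and a JSON Schema for its input.

    # Rough sketch of what an MCP-style tool descriptor boils down to.
    # The concrete tool ("get_weather") is made up for illustration.
    get_weather_tool = {
        "name": "get_weather",
        "description": "Return the current weather for a given city.",
        "inputSchema": {
            "type": "object",
            "properties": {
                "city": {"type": "string", "description": "City name, e.g. 'Berlin'"},
            },
            "required": ["city"],
        },
    }

    # Compare that with a full OpenAPI v3 document: paths, servers, security schemes,
    # content negotiation... none of which a model needs just to call a tool.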
I don't think we should be surprised by this. A creature that needs to operate its body in a 3D environment, perform complex manipulations of objects, participate in social interactions, and probably use some sort of planning to optimise its pollen-harvesting activities has a very good chance of being acquainted with the concept of time in one way or another.
What is indeed fascinating is how scientists come up with all these experiments.
> you can run out of space at one partition, but have lot of free space at another
That's exactly the point: you can run out of space in /home without affecting, for example, /var. Or vice versa: a log explosion in /var stays contained within its own partition and does not clog the entire filesystem.
Very important for /var/log: it's pretty easy for a log-spamming app to fill the drive, and you don't want growing logs to push your database into an out-of-disk-space state.
One could use a trigger for this. All we need is to set up a trigger that deletes expired records, based on some timestamp column, whenever the table is written to. That would eat up some latency, but as was said, most projects would find it good enough anyway.
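A minimal sketch of the idea using Python's built-in sqlite3 (the table and column names are invented for illustration; the same pattern translates to PostgreSQL or MySQL trigger syntax, and a real setup would likely add an UPDATE trigger as well):

    import sqlite3
    import time

    db = sqlite3.connect(":memory:")
    db.executescript("""
    CREATE TABLE cache (
        key        TEXT PRIMARY KEY,
        value      TEXT,
        expires_at REAL   -- unix timestamp after which the row is considered expired
    );

    -- Every insert also sweeps out rows that have already expired.
    CREATE TRIGGER purge_expired AFTER INSERT ON cache
    BEGIN
        DELETE FROM cache WHERE expires_at < CAST(strftime('%s', 'now') AS REAL);
    END;
    """)

    now = time.time()
    db.execute("INSERT INTO cache VALUES ('old', 'stale', ?)", (now - 60,))
    db.execute("INSERT INTO cache VALUES ('fresh', 'ok', ?)", (now + 60,))
    db.commit()

    print(db.execute("SELECT key FROM cache").fetchall())  # only 'fresh' survives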