Well before AI co-pilots, something happened to the good old admins--they started to disappear, only to be replaced by "AWS devops" (their job titles) who have never wired a network using routers, switches, and cables. I noticed that they started lacking basic networking knowledge and couldn't set up networking inside AWS. They just didn't know what a gateway, NAT, or subnet is.
Similar things are happening with AI co-pilots. We have an increasing number of people who "write code", but the number of people who can understand and review code is not increasing. There is also a problem of injecting ethics and politics into those tools, which can produce silly results. I asked Bard to write me a Python function to turn the US Constitution into a palindrome. Bard refused and gave me a lecture on how the US Constitution is too important to be played with in such a trivial fashion. I then asked it to produce code that turns the US national anthem into a palindrome; it refused again. So I asked it to do the same but with the Russian national anthem, and it spat out code without telling me off. I then asked it to generate code for simple tasks and it did an OK job, except the formatting and the fonts used were different every time, because it just lifted code from different webpages and recombined it like a massively hungover student waking up to realise he's supposed to hand in the assignment in one hour.
> Well before AI co-pilots, something happened to the good old admins--they started to disappear, only to be replaced by "AWS devops" (their job titles) who have never wired a network using routers, switches, and cables. I noticed that they started lacking basic networking knowledge and couldn't set up networking inside AWS. They just didn't know what a gateway, NAT, or subnet is.
I see the same arguments about higher-level languages. All of the tech industry is about standing on the shoulders of giants. I do believe that lower-level knowledge can help a great deal, but using this argument I could say something like:
Kids these days don’t know basic assembly, they have no idea what an or/and/nor/xor gate is, they’ve never built a computer from components (no, not cpu/mb/ram/etc, I’m talking transistors and soldering).
Maybe LLMs are different but I don’t think they are, they are yet another tool that some people will abuse and some will use wisely. No different from an IDE or a higher-level language in my book.
I have next to zero idea how many of the things I use daily actually work (my car, for example), and it doesn't stop me from being able to drive where I need to go. I have configured lower-level networking equipment in the past and I couldn't be happier that I don't have to do that drudgery anymore.
All I’m saying is this is a dangerous argument to make because someone can always one-up (one-down?) you. “Oh you had NAT? Luxury! In my day we didn’t even have a network, we had to….” (See also: Four Yorkshiremen [0])
I do not wholly agree with you. Just to pick on one thing in your reply...
> Kids these days don’t know basic assembly
It is a problem. Knowing assembly is crucial if we want to have more efficient, faster hardware. Someone has to write kernels and drivers. The efficiencies happening at the lower levels of the stack make the upper layers faster and more stable. It is also important that we have a large pool of devs who know how to do it. Otherwise that knowledge will be taken in-house and shared under strict NDAs that prevent devs from working for the competition, effectively killing all the efforts of the Open Source community to wrest free access to the computer out of the claws of corporations.
This is not a problem that AI has brought about first, but co-pilots are accelerating it.
Just to provide some context: I could not wait for 16-bit, 32-bit, and then 64-bit architectures to become popular. I was the happiest dev around when object-oriented programming tools became available, and the same goes for Linux, the web, all the frameworks, and even cloud computing. But AI as it is currently being sold is a shit tool. If it stays and doesn't disappear after the VC money runs out, we will have to adapt our ways of working and our educational courses to teach devs to review the work of others as recompiled by AI (it can't write shit on its own). Not sure it is worth the effort, because every time a new framework is introduced the results generated by AI will be of low quality until the models get trained on examples written by humans... What's the point of using AI in that case?
Agree here. As a 90's kid I dove into computers, the Internet, and programming head first. I went into the field out of passion, not money. I received an engineering degree.
If I were the same kid now, I'd get into a bootcamp or self-teach and get a web dev job because the money is lucrative. Yeah sure, maybe I'll dabble in foundational items here and there, but most won't. It doesn't advance your career in web/software.
IMHO we will see a brain drain in computer/electronics engineering in 20 years when the pipeline of graduates dwindles and those with experience retire. Hope LLMs can innovate enough to make up for it, I guess.
> If I were the same kid now, I'd get into a bootcamp or self-teach and get a web dev job because the money is lucrative. Yeah sure, maybe I'll dabble in foundational items here and there, but most won't. It doesn't advance your career in web/software.
That sure sounds like it has nothing to do with LLMs and everything to do with factors that aren't even specific to computer science. “Capitalism”, or maybe the trendier “late-stage capitalism”, is probably more to blame here than anything else.
Also I reject the idea that there are no kids today that get involved in tech/computers for the joy of learning/exploration.
I bristle whenever I see arguments about “kids these days didn’t learn exactly how I did so it must be wrong”. It reeks of “old man yells at cloud” (note, I was also a 90’s kid). It’s the same BS I heard as a kid about how computers would rot your brain or they were horrible for <insert stupid prediction that didn’t come true>.
I agree that it reeks of old men yelling at clouds, but there is data-driven precedent here: with the advent of smartphones, computer literacy in education has dropped like a rock. It isn't just kids these days, it's a genuine shift in computing across the entire populace that happens to really affect the funnel for programmers right at its entrance. Copilots are an extension of that development, not necessarily something wholly new.
I’ve heard this anecdotally from every educator I know at the high school and college level. CS students are increasingly entering without any computer proficiency whatsoever so it has to be built up from scratch.
Yep, despite all the talk of young people being "digital natives", installing apps from Android/iOS app store, taking pictures and chatting with friends doesn't teach you much that is valuable for office work.
PC gaming and specifically modding is what I can thank for much of my computer proficiency, but even that is easy these days with Steam and its workshops for many games.
Of course, you've now effectively circled back to the early 1980s at some level, when there was not a lot of expectation that college freshmen had touched a computer, much less were proficient, even if they wanted to major in CS/EE.
I think the apt analogy here isn't "old men yelling at clouds" but the historical way that one particular generation became more adept than those before or after at working with a certain then-new technology -- thus becoming a uniquely skilled generation in that technology, more so than their elders or their youngers, never to be repeated again.
I'm not talking about millennials/gen X with computers -- I'm talking about the silent gen/(boomers/greatest gen?) with cars.
In both cases, a new technology dominated the world, but was new and brittle at first. The kids who saw it in their youth were fascinated, but because of its brittleness they had to become at least minor experts in troubleshooting and maintenance (like tuning a carburetor or defragging a hard disk) just to access the coolness.
I mean, it really was a pain to be a car owner in 1960. A lot like being a computer owner in 1995. If you wanted to enjoy one, you were going to need to change a spark plug or registry value once in a while. And you had to be ready to recover from an engine overheating or a blue screen of death, because these were not rare events.
Then the future generations increasingly lost touch with those skills because the technology got smoothed out as it developed further.
So I think it's quite plausible that future generations will permanently have less interest in serious computer skills, since the same thing has happened with cars. There is a way smaller percentage of people my age (millennial) who are "into" cars or have moderate familiarity with car repair than people in their 70s. So it seems plausible the same pattern could play out with computers, rather than it being an "old man yells at cloud" illusion.
> Knowing assembly is crucial if we want to have more efficient, faster hardware
This is a goal for maybe 2% of the industry, and those folks know their assembler or they don't get hired. I think we are producing skills in proportion to the industry's requirements.
I hate to dash your hopes of never seeing LLMs write code again, but Llama 3 70B is not terrible as a copilot, and it's small enough to run on comparatively cheap hardware, or even consumer hardware when quantized. Plus, it's publicly available, so from now on there will always be some form of copilot service or tool in use; that ship has sailed. The good news is that it doesn't look like we've maxed out the capabilities of GPTs yet, so as more are trained, compute becomes cheaper, and we produce and accumulate more data, they should keep getting better, for a while at least.
> Not sure it is worth the effort, because every time a new framework is introduced the results generated by AI will be of low quality until the models get trained on examples written by humans... What's the point of using AI in that case?
LLMs will not replace us, they just take some of the tedium out of writing code. Yes, they will have to be trained on new frameworks and languages but I don’t see that as a problem. People have to wait for SO questions and blog posts to be written about new frameworks and languages as it stands today.
I’m not sure I agree with your arguments about needing a lot of devs to understand assembly or the like. Or rather, I disagree with your prediction of where that leads. We are effectively already there and it hasn’t led to your future. We have an amazing open source ecosystem that billions of people rely on daily without knowing how every part of it works, and open source has only grown during that time.
I’ve never done the math myself for calculating the distance between two points on a globe. I just grab the Haversine formula and keep going. I don’t see that as a problem. Similarly, I’ll reach for an open source library to solve a boring problem instead of doing it myself. I quite literally see no difference between that and using an LLM.
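For the curious, "grabbing the formula and keeping going" looks roughly like this; a minimal Python sketch (the coordinates in the usage line are just an illustration):

```python
from math import radians, sin, cos, asin, sqrt

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two (lat, lon) points, in kilometres."""
    R = 6371.0  # mean Earth radius in km
    phi1, phi2 = radians(lat1), radians(lat2)
    dphi = radians(lat2 - lat1)
    dlam = radians(lon2 - lon1)
    a = sin(dphi / 2) ** 2 + cos(phi1) * cos(phi2) * sin(dlam / 2) ** 2
    return 2 * R * asin(sqrt(a))

# London to Paris, roughly 343 km:
print(round(haversine_km(51.5074, -0.1278, 48.8566, 2.3522)))
```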
> Kids these days don’t know basic assembly, they have no idea what an or/and/nor/xor gate is, they’ve never built a computer from components (no, not cpu/mb/ram/etc, I’m talking transistors and soldering).
At each layer, though, we have a legible formal theory bridging one level to the next, and the abstractions tend to be so reliable the last thing most of us will consider is a hardware/assembler/compiler bug (though they do occasionally happen).
LLMs bridge layers illegibly and probabilistically. I can already see how they're helpful, but they do have some different characteristics that mean reliability and the connection to lower levels of the stack are an issue.
> LLMs bridge layers illegibly and probabilistically.
I’m not sure I agree but I want to make sure we aren’t talking past each other.
Using an LLM to write code does not have this problem IMHO.
Using an LLM as part of the code does have this problem.
I’m not writing off the usefulness of calling out to an LLM from code but it does bring “illegibly and probabilistically” into the mix whereas using an LLM to write code that’s human-reviewed/modified does not.
In fact, you should be unable to tell what I wrote vs what the LLM wrote if I’m doing it right (IMHO). “Clever” code is always bad no matter who/what wrote it.
The only time I’ve been asked “did you use <insert LLM> to write this?” was due to the speed at which I completed something, or because it was in a language that the person asking knew I didn’t know very well.
LLMs can be good at guessing the code you want, but I've seen them get the comment or documentation right while the implementation does the opposite of what you asked. They've also generated tests for the incorrect case, with comments describing the opposite of what was wanted.
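A contrived sketch of what I mean (hypothetical function, not actual model output): the docstring matches the request, while the body and the generated test both encode the inversion, so the suite passes and the bug looks "verified".

```python
# Asked for: "a helper that returns True when n is even".
def is_even(n: int) -> bool:
    """Return True if n is even."""   # the docstring matches the request...
    return n % 2 == 1                  # ...but the body actually tests for odd

def test_is_even():
    assert is_even(3)      # the generated test encodes the same inversion,
    assert not is_even(4)  # so it passes despite the wrong behaviour
```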
For somebody new at the language or new to programming in general, that difference can be imperceptible until it gets to a review.
I've also managed to spot people using an LLM to write stuff due to them doing things which won't pass linters or code tests.
The subtle differences between comment and code can absolutely be a problem; I’ve fallen prey to it as well. But as long as you test/execute the code, you’ll catch it even if you didn’t catch it from reading over the code. I often reject anything overly clever, or that I can’t reason out quickly, when it comes to LLM-generated code.
As for not passing linters or code tests that’s unacceptable behavior from an employee. If that happened more than a few times I’d be having a talk with the employee followed by showing them the door if they continued to do it. I’d behave the exact same way if no LLM was involved.
I do wish LLMs like Copilot could use the lint rules and/or tests to inform the code they generate, but thankfully it’s normally a keystroke away for me to reform a block of code to match the style I prefer, so it’s not the end of the world.
To use LLMs (or any tool) you have to be able to show you can use it correctly, safely, effectively, etc. If you can’t do that then you have no business being in a coding role IMHO.
Points for a great twisting of the metaphor I used, I quite like it.
I agree _someone_ has to have that knowledge; I reject that _everyone_ must have it.
I have zero desire to learn more about logic gates. They don’t interest me, and while they enable the work I do, I don’t need to know about them to do that work. In the same way, I couldn’t tell you how my car engine works (OK, at a high level I could, probably about the same level at which I could talk about logic gates), but I’m able to drive from point A to point B without issue.
There are people who enjoy assembly, let them work on it. I’m not one of those people and I don’t have a problem with those people but I do have a problem with people who think I must know assembly to be an effective developer.
> I have zero desire to learn more about logic gates
Sure, but you know that they exist, which to me suggests that you learned about them somewhere and probably have at least some foundational understanding of how they work.
The problem is that when people can skip that broad foundation and go straight to specializing somewhere, a lot of general knowledge is lost.
Part of learning the broad foundations is also learning which options exist for a specialization path.
Someone who might love working on hardware design may wind up spending their life as a mediocre web developer because they took an AI-assisted web bootcamp that skips the important fundamentals.
It's an engine, and can make all kinds of things move.
If you don't know how it works, sure, you can drive a car, but a lot of opportunities to leverage the same exact tech for other use cases disappear when you don't know the fundamentals.
Interesting; the college I went to (2009) didn’t teach it at all. C++ was the lowest level they taught, and they switched to Python after I left (as in, they use it instead of C++, last I heard).
If I ever needed to write assembly (which probably means something has gone horribly wrong in my life) then I’m sure I could learn it on the fly, just like I’ve learned other things never taught in college on an as-needed basis.
This argument falls apart for me personally. I'm sure we could go back a decade or two and find someone saying something like:
It’s quite alarming to think that the most advanced language many new software engineers are familiar with is merely C or C++.
Imagine graduating without ever having programmed in assembly. It’s not that C or C++ aren’t capable languages, but we’re definitely in a strange era when the deeper understanding of machine interaction through lower-level programming is becoming a rarity.
I don't think that's quite the same, but in general I agree with that sentiment, even though you weren't serious about it.
Folks should know their basic C/C++, just like they should know enough x86 to make sense of what godbolt.org tells them. But oh well, that's probably a lonely hill to die on. Having a rough idea of how all these levels of abstraction work and interact is something I profit from a lot in my line of work, but that's just a small niche and I get that (profiling and optimizing scientific signal-processing routines).
I consider assembly to be a basic skill like arithmetic. Most of us have a calculator on hand 24/7, but being able to look at a clock and say 'I've got 12 minutes' is useful.
I had to take an "operating systems" class in Java 15 years ago; luckily I already knew enough about C and assembly to know it was a waste of time. The professor thought it was more important to teach "fundamentals" than low-level programming.
I got a C in that class (the grade, not the language) because I didn't know Java and kept having difficulties with the way it forces you to shoehorn OOP into everything.
I don't remember seeing a class on it, but it was a long time ago, and I also don't remember it being required for the degree. I just looked on my college's website and it seems they offer two assembly classes. I'm not sure if they existed when I went there (I assume they did), and neither looks required, but I don't have the time or energy to dive back into that kafkaesque hellscape of figuring out what's required to graduate.
However, I didn't get a degree so maybe it was shoehorned into some high level course. I had a job writing software while I was in college and dropped out to do that full time once I felt I wasn't getting any value from continuing my "education". It's never been an issue in my career and I've done well by my own metrics at least.
Yeah, my husband's college (I'm a SW eng, he wants to be one) teaches C++ as the base... and this is "just" a state school in Arkansas.
Also, gotta correct myself: after asking him, it looks like I was wrong. Assembly isn't a direct requirement, but it's one of three electives you have to choose from.
You have to choose between assembly, cybersecurity, or data science, and assembly is apparently considered the easiest.
I feel comparing knowing assembly to knowing about NAT/subnets/gateways is a false equivalence.
Outside of a small set of folks, most devs need NOT know assembly, because compilers are very good at generating it.
Not the same with subnets/gateways/NAT. These are first-class constructs in most cloud deployments and have huge implications for security, performance, and cost. Unless you have some kind of automated layer that abstracts these issues away (akin to a compiler), most devops folks need to learn these concepts to work in a cloud environment.
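To make "first-class constructs" concrete, here is a minimal boto3 sketch of wiring up a VPC by hand; the CIDR blocks are placeholders, and in practice you'd add tags, availability zones, and error handling:

```python
import boto3

ec2 = boto3.client("ec2")  # assumes credentials and region are configured

# A VPC, a subnet, a gateway, and a NAT are explicit API objects,
# not details the platform hides from you.
vpc = ec2.create_vpc(CidrBlock="10.0.0.0/16")
vpc_id = vpc["Vpc"]["VpcId"]

subnet = ec2.create_subnet(VpcId=vpc_id, CidrBlock="10.0.1.0/24")

igw = ec2.create_internet_gateway()
igw_id = igw["InternetGateway"]["InternetGatewayId"]
ec2.attach_internet_gateway(InternetGatewayId=igw_id, VpcId=vpc_id)

# Routing decisions are yours too: send non-local traffic out the gateway.
rt = ec2.create_route_table(VpcId=vpc_id)
ec2.create_route(
    RouteTableId=rt["RouteTable"]["RouteTableId"],
    DestinationCidrBlock="0.0.0.0/0",
    GatewayId=igw_id,
)

# NAT for private subnets is likewise something you create explicitly.
eip = ec2.allocate_address(Domain="vpc")
nat = ec2.create_nat_gateway(
    SubnetId=subnet["Subnet"]["SubnetId"],
    AllocationId=eip["AllocationId"],
)
```

Get any of this wrong (or leave it out) and things fail in ways you can only debug if you know what a gateway, a NAT, and a subnet actually do.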
I think there may be a difference in kind here than with a lot of previous labor-saving abstractions.
To use an LLM, you still need to know the language you are asking it to write. You don't need to know C or assembly as a Python/JS programmer today, but you do need to know JS to be an LLM-heavy JS programmer.
> You don't need to know C or assembly as a Python/JS programmer today,
I agree, not everyone does. Maybe we have progressed to the point we can all say those people are wrong and their arguments were always wrong, maybe not.
> but you do need to know JS to be an LLM-heavy JS programmer.
I also agree. That's my whole point, you do need to know the language to be able to effectively use an LLM.
My issue is with people who immediately reject LLMs as a net bad because they do part of the job for you. It reeks of "well, I had to do it the hard way, so you do too", which I cannot abide and which immediately flips the bozo bit on them for me.
Okay, but if you don't know what the JS is doing and why, how are you ever going to architect things efficiently or think up more complex things in the abstract? I fail to see how you can get by knowing just HTML and CSS; that's like saying you have the alphabet down but somebody else has to speak or write for you. HTML and CSS are just markup and styling languages; they aren't really code.
CoPilot is just blundering along using what it has learned from elsewhere - which is not always correct.
I'm very impressed with the best models — but, because I remember how awful NLP used to be, "very impressed" still means I rate the best as being around the level of an intern/work placement student most of the time, and even at their best still only a junior.
It's great, if you're OK with that. I've used GPT-3.5 to make my own personal pay-as-you-go web interface for any LLM that's API-compatible with OpenAI despite not being a professional Web Developer.
It's fairly fragile because it's doing stupid things to get the "good enough" result.
But that's OK, because it's for me; I'm not selling it.
(As for the lesser models… I asked one for a single-page web app version of Tetris; it started off lazy, then suddenly switched from writing a game in JavaScript to writing a machine learning script in Python!)
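"API-compatible with OpenAI" is the whole trick, by the way: you point a standard OpenAI client at a different base URL. A minimal sketch, where the base URL and model name are placeholders, not real endpoints:

```python
from openai import OpenAI

# Any server that speaks the OpenAI chat-completions protocol works here.
client = OpenAI(
    base_url="https://example-llm-host/v1",  # placeholder endpoint
    api_key="sk-...",                        # key for whichever provider you use
)

resp = client.chat.completions.create(
    model="some-compatible-model",  # placeholder model name
    messages=[{"role": "user", "content": "Say hello in French."}],
)
print(resp.choices[0].message.content)
```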
> They just didn't know what a gateway, NAT, or subnet is.
If you ask me, they shouldn't have to, in the same way that you probably don't know how to diagnose a coax cable problem.
The major public clouds all have completely software-defined networks (SDNs), and concepts like subnets are basically emulated for the sake of legacy systems.
Why would we want subnets, like... at all? Why can't all systems just have an IP address and use point-to-point communication? That's literally what the underlying SDN does with the packets anyway! The subnets and routes you see don't map 1:1 to the underlying network. The packets are stuffed inside a VXLAN packet and routed between random hypervisors scattered across multiple buildings. Just give up the pretence, give everything a system-assigned, publicly routable IPv6 address, and be done with it. Better yet, auto-register everything in a private DNS zone and auto-generate internal-use SSL certs too.
The amount of wiring we do manually is just insane, just so that Azure and Amazon can help greybeards pretend that they're operating a Cisco router and need to stand in line to file the paperwork required to get an SSL certificate.
> you probably don't know how to diagnose a coax cable problem
No, that was definitely part of the sysadmin job description ~25 years ago, both line discipline and physical medium problems (I had a TDR and a fiberglass fish at my desk for my first sysadmin job and was kind of bummed to find out that isn't remotely normal anymore)
Problem is that all those systems are proprietary to individual clouds.
Congratulations folks! We undid several decades of progress and went back to the 1960s and 1970s when software can only run on the specific mainframe for which it was written.
Which is why I don't write for the cloud. My personal stuff goes onto a dedicated Linux server set up with docker compose, and my work stuff is all hosted on Windows servers.
I'd prefer switching those windows servers to Linux, but we're too integrated with Microsoft at this point.
It’d be easier to swap those Windows servers out for Linux than it is to migrate the typical cloud native application from one cloud to another.
The cloud is the most closed, most locked in, most rent extraction oriented computing paradigm since the pre-minicomputer mainframe era. It’s really a return to that era.
Computing tends to cyclically reinvent wheels. I suppose next we will reinvent minis, micros, LANs, etc.
I think you’ve stumbled on the solution there - *something* happened between the 70s and today to address that thing from the 1960s, and it was motivated by capital-seeking. There’s a high probability that it will happen again, and if it does not then it’s likely the market didn’t demand it. Or, the world is just one big corporation and we’re all screwed. One of those.
The problem with that idea is that it requires every router to know about every system on the network. Unless they abandon Ethernet and replace it with MPLS source routing, which maybe they should. You'd still have a maximum subnet size, because the hardware firewall that would still have to be in front of every server would have a limited capacity.
I've noticed that ChatGPT will speed up the monotonous part of writing code, but will often inject a few bugs that I need to fix. In a way, ChatGPT has become my personal junior developer (which I only use for basic stuff) while I'm still dependent on my own expertise to review and fix the code, but at least it's still faster than using Google and writing most of it myself. Juniors will always have bugs in their code, but when you bypass the step of writing the code and figuring it out for yourself, how do you effectively learn and become a better developer?
In a way, ChatGPT both replaces the need for junior developers and creates a situation where you'll end up with a developer shortage because not everyone can start right away as a mid-senior level.
This is definitely appealing to my curmudgeon heart. But I translate it into my corner of the world and the equivalent is that everyone should know C. We don’t need everyone to know C.
Everything else being exactly equal, would I like someone I work with to know C over not? Sure.
But would I take a world with many fewer programmers that all knew C over the one we have today—definitely not. It would be a significantly poorer world.
Your mentioning of AWS devops reminded me of a story my dad, a mechanical engineer, told me. He travels a lot and works with engineers all over the world. He told me that among the new "wave" of engineers he works with (mainly in Canada and America), there is a complete lack of people who know how to do things like manually calibrating parts by hand.
He's had to completely rethink how they approach their industrial designs and build machines, transitioning to much more electronics-based things with fewer moving parts and building things they can just adjust with computers and point-and-click GUIs, because people just don't know how these old things work from a physical-reality perspective. My dad was very distraught by the caliber of engineers he's seeing in the "next generation".
But that's not the only thing.
In Canada, our aging population of tradespeople is retiring in droves and we have a dearth of younger people to replace them. It's so bad that in my province they've completely redesigned the Grade 11-12 high school curriculum to have "fast track" paths into the trades, where 80% of students' time is spent in co-op/apprenticeship (a good thing!). We have a housing crisis and nobody who actually knows how to build houses in our country, while the federal government imports 1 million people per year with nowhere for them to live. Most kids these days just want to be "influencers".
Spelling and grammar in young people are absolutely atrocious because they are totally dependent on autocorrect, and they can't type on physical keyboards because they've only ever used a touchscreen keyboard on phones and tablets.
It's not just programming being destroyed by copilots. It's an overall dumbing down of our entire civilization.
I have the same issue with young software engineers. We used to install every Linux distro because it was fun. Juniors are now scared of the command line.
Another example that I see all the time: I recently had to review a pull request from a junior. 200 lines of very simple C++ doing almost nothing; you can't fail that. I had to write more than 50 suggestions and comments on very basic stuff that the guy should have learned when he started coding. Yes, they must learn the ropes, but it seems like the passion is not there. We used to be passionate about coding, and all I see is people who went to private college to make money but don't enjoy what they do.
I was a hobbyist programmer installing all sorts of Linux and BSD distros on my computers since the age of 10. My first real software job still had me go through a period where I had 50+ comments on some of my initial changes because life before work doesn't really prepare you for what you need to know on the job. If they aren't learning from your feedback and constantly making the same mistakes then I understand, but it's somewhat misguided to expect folks to simply know a lot early in their career.
Heck even folks coming in with experience, but from a company with a very different coding culture or simply a different primary language can result in the same situation.
Whether you/they learn is the question. I'm not in programming but in a different role in tech, and I’ve had a recent intern who did a very competent job and vacuumed up feedback and opportunities to learn. And I’ve had a longer-tenured person who moved in from a different type of role at a different company who… did not, and just couldn’t appreciate how what they were creating was a net time sink for everyone else. They didn’t last.
Niche-ness isn’t a hard metric, but the argument is the same for “this is what happens when a profession hits a critical mass of people just in it for the money and not at all for the enjoyment of the profession.”
Lol, I'm reinstalling a Linux distro right now just because I want to try a new one.
I work with seniors who are scared of the command line. I can somewhat understand why Microsoft might be trying to turn everything into AI Copilots, because when I'm trying to explain to someone how to revert changes in a file with `git reset/checkout`, they practically recoil in terror at the suggestion of using the terminal. They are married to their git GUIs and have no idea how they work. Best to AI-ify as much as possible to keep people in their ecosystems.
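For reference, the terror-inducing incantations in question amount to a few lines (the file path is a placeholder):

```
# Discard uncommitted changes to one file (classic form):
git checkout -- path/to/file

# The newer, clearer equivalent:
git restore path/to/file

# Unstage a file without touching the working tree:
git reset HEAD path/to/file
```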
But are they any less productive? My experience is no. I was a command line aficionado when I entered the workforce, but I saw several people get the same results as me in a purely GUI environment. I see the same today. They get shit done and that is what is required. I feel like older devs romanticize certain parts of their workflow too much.
They're not necessarily dumber (maybe in the sense that they would be less able to survive on a desert island), but rather the same neural circuitry is being repurposed for other things. If anything the environment is dumber and they're just better at efficiently navigating it than we are.
When I need to explain to other so-called seniors how to revert changes to a file with git checkout and they recoil in terror at using the command line, then yes - I believe they are actually getting dumber.
No. Most kids strive to follow whoever "mentors" and teaches them.
What is sad is that this job is left to influencers. Not the kid's fault if our education system, parenting and attention economy are completely f'd up.
> There is also a problem of injecting ethics and politics into those tools, which can produce silly results.
Well, they can, but the "no filter" option does not necessarily generate satisfactory results either. I find it hard to blame Google for not wanting to put their brand on an AI providing a ranking list of races by intelligence or whatever kind of offensive thing people are going to try to make it produce.
We are talking about a code generator. Dev tools ought to be neutral when it comes to politics and ethics; otherwise you get a smart hammer that will refuse to nail Jesus to a cross. That's not how tools work.
Safety devices put into tools to prevent misuse or accidents are also not that unusual. Feel free to change my example to something like "a code snippet that will use the 23andMe API to prevent users with insufficient European ancestry from registering an account" if you want a more code-specific example of reputational risk from an unfiltered AI.
> and it did an OK job, except the formatting and the fonts used were different every time, because it just lifted code from different webpages and recombined it like a massively hungover student
Tangent, but it used different fonts? I can’t imagine how this is possible given how LLMs work under the hood (sampling likely tokens which represent Unicode character sequences - it doesn’t operate on rich text, so AFAIK it has no way of seeing or changing font information). Is there something weird happening with the frontend breaking the code/not-code formatting (which is a cosmetic thing applied when rendering the output)?
As for the constitution - changing one but not the other could be a matter of cultural sensitivity as well (although I bet Bard isn’t that smart).
As a Polish person, I don’t mind you changing the constitution, but with our national anthem, for example, we are way more sensitive about changing how it sounds than Americans are about theirs. Ditto for where our flag and our national symbol can be used.
As for Bard - did you try arguing with it that the constitution guarantees you the freedom to play with it? I managed to convince GPT to do some weird stuff by arguing for cultural sensitivity.
What Bard couldn't understand was that what I actually expected it to do was to write (well, steal--these things simply give answers that look plausible by recombining other people's work) a function that takes a long piece of text and turns it into a palindrome. It cannot deal with ambiguity and imprecision, but it will refuse to acknowledge that fact and will keep generating random answers, like a public schoolboy who insists on winning an argument despite not knowing a thing about what he's arguing for or against. When I asked a more precise question, requesting a function that takes a long piece of text and turns it into a palindrome, it gave me a different implementation every time I asked the same question, behaving like a kid who has no clue but will give a random answer and watch the teacher to see if it gets accepted.
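And it's not as if the task is hard; a naive version is a couple of lines of Python (one of many possible approaches, shown here just to illustrate how trivial the request was):

```python
def make_palindrome(text: str) -> str:
    """Append the reverse of the text (minus its last character),
    so the result reads the same forwards and backwards."""
    return text + text[:-1][::-1]

print(make_palindrome("abc"))  # -> "abcba"
```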
I will not argue with Bard or any other tool of that sort; I have better things to do. They are supposed to be good at translation, so I thought, OK, let's try giving it a piece of my own writing in English and ask it to translate it into French (a gendered language). It could not infer from the text that the protagonist of the internal monologue was female, so it translated everything in a male voice. I asked it to switch to a female voice, but it stopped doing so after translating about two pages, and even on those two pages it kept slipping back into the male voice. Then, after translating three pages, it replied "I don't know how to do it" when I asked it to translate another page, even though I used the same prompt I had been using for the previous pages. These are toys, not tools. They are shit at their job and a waste of time and electricity.
Since you are Polish: I was told by my Polish colleagues that onet.pl, a large Polish news site, uses AI to auto-translate articles from foreign media sources, and that all of those translations are done in the male voice, even when the original text clearly describes what a woman did or said. My colleagues say it's been going on for a while and nobody bothers to fix these translations. If true, the future is here, it is shit, and AI is good at the one thing VCs love to fund, i.e. extraction of value through destruction of value.
I agree. The level of “basic” knowledge among developers is dropping.
The consequence is that developers can tackle basic tasks which are supported by the frameworks they use, but once something is not supported or straightforward, they don’t know what to do and get completely stuck.
From society’s point of view, the usefulness and value of the workforce decrease, and important problems are either not solved or not solved efficiently.
They don't know what they don't know. (Devs not knowing where to start writing a piece of code that works on actual bytes rather than strings or numbers is a classic example of this problem.) Sometimes they are good devs in their space and mean well, but they spend their working lives at a different level of abstraction and in a different problem domain. The danger with co-pilots is that devs using them will not even know what to ask for, because they will not know what it is they need to write. There is value in going through docs, writing your own code, making mistakes, seeing your code fail, fixing it. That's how you learn to spot mistakes made by AI, and I don't think that having a coding puppy and spending time fixing its mistakes is conducive to writing better code or improving developer productivity.
> We have an increasing number of people who "write code", but the number of people who can understand and review code is not increasing.
Something like this was happening with coding bootcamps as well. Lots of people who can code what they were directly taught but don’t have the basics to go outside of that.
Yes, I agree. But good companies are aware of this phenomenon, and these so-called ChatGPT programmers are considered juniors. There is always a place for you as an experienced developer who understands memory management, I/O performance, stack execution, step debuggers, compiler optimizations, and much more.
> because it just lifted code from different webpages and recombined it like a massively hungover student waking up to realise he's supposed to hand in the assignment in one hour.
That’s CTO level coding right there, gotta tell bard about my great app idea.
> I asked Bard to write me a Python function to turn the US Constitution into a palindrome. Bard refused and gave me a lecture on how the US Constitution is too important to be played with in such a trivial fashion.
Jesus, they have coded pearl-clutching and finger-wagging lectures in my computers. You know, those machines that used to be brutally efficient at doing what you asked of them.
> those machines that used to be brutally efficient at doing what you asked of them.
In a way, that's a natural thing to happen. Computer systems are weird in that we have an extremely good model of how they work at a certain level: "the processor will (almost) always execute assembly exactly as it should," etc.
When computers first came around, this "computer does exactly as you say" thing was weird to people because there weren't really any such things before computers. People would make small mistakes in their code and wonder why the computer can't "just do the obvious correct thing like any sensible person".
However, as multiple complex software systems interact and build on top of each other, this ability to "understand everything" at one of these "perfect model" levels (e.g. assembly, logic gates, state machines...) becomes useless. To work with them, fuzzy models (like those we use for other humans or animals) are more appropriate. Complex software systems will routinely not do the same thing when given the same command. They will change over time, break in incomprehensible and unpredictable ways, and, as the newest addition, argue with you instead of doing what you say. That development makes total sense. At least those systems can now "take a hint".