spopejoy's comments (Hacker News)

How convenient that your analysis elides debt servicing from war and increased discretionary military spending. Debt rose the most under Reagan and Bush and now Trump with the same call for cutting taxes while bloating military spending. And let's not forget the TARP and other bailouts in 2008. But by all means, talk about "corrective market forces that curb waste" -- tell me, when has any government in history been run by "market forces"?

Don't get too complacent; the far right is on the rise in the EU, and corruption follows.

"Corruption follows" :') ... corruption is already there. Just the color of the banner is changing. Don't be naive.

The EU isn't perfect, and even its best member states are not perfect (though still better than the US is or was), but only the worst two EU nations seem politically corrupt to the point of "danger" levels.

There's plenty of room to fall if people are complacent.

General corruption: https://en.wikipedia.org/wiki/Corruption_Perceptions_Index


As a person from a country constantly near the top of that list, I have been saying this for more than two decades: holding the #1 spot in the CPI says nothing about how well things are going in a country; it merely highlights how bad things are even for the runner-up.

I can't understand why denigrating someone as a prostitute or w**e is not called out as inappropriate if not fully misogynist. Its history is deeply, inescapably misogynist, it's anti-sex worker as you say, and it's just tacky. Corruption of morals for money doesn't need to be feminized to make an argument.

> no vision

Jack's trademark it seems.


Yeah the whole bitcoin stuff is cringe.

> in my experience there’s a large contingent of people, especially the youth, that are more reactionary about AI than they are interested in creativity.

First off -- are you an artist? As in, are you making your argument with skin in the game for something you _need_ to do, not just a pastime that makes the day job livable?

Not gatekeeping! Trying to see if you are formulating your position as a creator or a consumer.

If the latter, hate to say it, but your opinion is kind of irrelevant. Ultimately, only artists really understand what's involved in creating real art. Not what's good or bad, but what's at stake and how to tell if somebody's for real.

If you're a creator I'm a little puzzled. Are you really worried that AI is so freaking great that the horrible luddites at bandcamp et al are going to "gatekeep" us away from incredible AI art? This is NOT something that keeps me up at night.


I'm also getting really annoyed by AI-generated images like the ones in this article: they don't really help comprehension, but they make the author feel like they're "pro blogging," because god forbid you have two paragraphs in a row without a subhead or an image.

Programmers complaining about AI but then ripping off umpteen illustrators' labor through AI is infuriating.


Sadly, journalists are super-addicted to X. They're a tentpole community for the platform at this point.


Amazing!

I was able to confuse the LLM with the following command while in the living room: "go back outside and find mailbox" (which I had forgotten to check before going in the window).

It got super confused and went in and out of the window three times. It did actually recover and plop me outside, suggesting I head north and west. Good times!


Humans and all other organisms are "literally" not machines or devices by the simple fact that those terms refer to works made for a purpose.

Even as an analogy "wet machine" fails again and again to adequately describe anything interesting or useful in life sciences.


Not really production level or agentic, but I've been impressed with LLMs for Haskell.

I think that while these langs are "niche" they still have quality web resources and codebases available for training.

I worry about new languages though. I guess maybe model training with synthetic data will become a requirement?


> I worry about new languages though. I guess maybe model training with synthetic data will become a requirement?

I read a (rather pessimistic) comment here yesterday claiming that the current generation of languages is most likely going to be the last, since the already existing corpus of code for training is going to trump any other possible feature a new language might introduce, and most of the code will be LLM-generated anyway.


I've wondered to myself here and there whether new languages might be designed specifically for LLM agentic coding, and what that might look like.


I had the thought of an AI-specific bytecode a while ago, but since then it's seemed a little silly -- the only langs that work well with agentic coding are the major ones with big open-source corpuses and SO/reddit discussions to train on.

I also saw something about a bytecode for prompts, which again seems to miss the point -- natural language is the win here.

What is kind of mysterious about the whole thing is that LLMs aren't compilers, yet they grok code really well. It always puzzled me that tooling wasn't smarter, and then with LLMs the tooling became smarter than the compiler -- and yet, if it actually were a compiler, we could instruct it with code and get deterministic results. Something about the chaos is the very value they provide.

