Hacker News | klez's comments

> I'm aware there are some important emails - but I challenge that there aren't as many as you think there are

The problem is not whether I think it's important. The problem is the customer thinking that's important. Or simply that I need to be aware of what they wrote. Or that I need to be aware of what another vendor wrote.

I'm not saying this is right, I'm saying it's where I currently stand.


On the other hand, I find that the real value of such posts is in the comment section here on HN, especially from posters who disagree with the article.

> You can cover almost everything with a handful of monthly subscriptions these days.

Maybe for you that's something you can afford. I can't. I just consume less music. Or sail the high seas if I really want something.


If we're purely talking about music then almost everything is on YouTube, which has a subscription cost of $0/mo.

Looks like the only way to get it is via CVS(!). Am I missing a link to the tarball somewhere? I suppose viewing it on mobile doesn't help.

Isn't this limited to the US?

Yes, but you can bet that if they succeed with this in the US they will try something similar in the EU. They're constantly testing the waters.

Probably for the same reasons

I wonder: is there any reason beyond sheer curiosity* to learn Fortran in 2025? Not being snarky, genuinely curious about what Fortran brings to the table.

* Nothing wrong with that as a reason, of course


Yes; I am seriously thinking of getting the "Modern Fortran Explained" book (https://academic.oup.com/book/56095). Modern Fortran has everything (procedural, OO, functional, multithreading, parallel features, etc.) that modern languages have. I would say it is no longer "just" for numerical computing. Excellent libraries/ecosystem, a well-supported and stable language, and fantastic optimization right out of the box make it a great general-purpose language.

Fortran in Modern Scientific Computing: An Unexpected Comeback - https://medium.com/@stack1/fortran-in-modern-scientific-comp...

5 Reasons Why Fortran is Still Used - https://www.matecdev.com/posts/why-fortran-still-used.html

Is Fortran better than Python for teaching the basics of numerical linear algebra? - https://loiseaujc.github.io/posts/blog-title/fortran_vs_pyth...

I take back everything I said about FORTRAN - https://x.com/ThePrimeagen/status/1745542049284423973

Modern Fortran online tutorial - https://wvuhpc.github.io/Modern-Fortran/


I suspect that nearly all of the Fortran code that will exist ten years from now already exists today, so knowing the language is a skill that is more likely to be useful to you for performance testing and code porting than for new development.

The trickiest part of really learning Fortran today is that it is hard to define what the language is, apart from the practical definition imposed by what its seven or so surviving compilers accept and how they interpret it. There are near-universally portable features that are not part of the ISO standard; there are standard features that are not portable, or not implemented anywhere. So what one should know as "Fortran" is the reasonably portable intersection of features across multiple compilers, and there isn't a good practical book that will teach you that.


Yes. Some scientific computing code is still being developed in Fortran, e.g. in HPC (and has been for decades).


Fortran is still the best way to write scientific computing codes for HPC environments. And not just because of legacy code. You know how all those language benchmarks have C as the reference class (and fastest)? Fortran is faster still, by a significant margin.

One has to hope OpenAI goes bankrupt and they flood the market with used RAM sticks at bargain-bin prices.


Don’t worry, they’ve cozied up for a nice fat bailout if that happens.


They bought unfinished components and have no ability to finish them. They're now waste just sitting on shelves, and by the time we could do this and get them into production lines, they'll be obsoleted by DDR6.


Yeah, that is my main worry: that whatever they ordered is actually unusable for normal people once they go bankrupt, so there is not only nothing to auction off to the creditors, but also nothing to alleviate the shortage in the short term.


As with all Ponzi schemes, OpenAI will eventually go bankrupt, yet it continues to receive money from the unwary. I wonder how the research division at Disney feels about what their bosses have done...

But it should happen earlier given what is happening with the RAM, which sounds quite illegal: anticompetitive hoarding, cornering the market, raising rivals' costs, consumer welfare harm, and so on.


> besides the risk that an OS update will break this app

Tangential, but what a sad state of affairs it is that an OS update can break your app. I'm not a Windows user (not voluntarily, at least), but I always appreciated the stability and backward compatibility that allowed old apps to run unmodified on modern systems. I heard they dropped the ball on this as well, though.


> This drives up the "real" price of these purchases because of the time constraint.

I'd say you answered your own question.

> telco's tend to be oligopolies and tend to also do some form of price collusion among themselves, it was generally accepted as "just how things were"

So are LLM companies, at this stage

> Am I missing something?

Not really

> Is this really how things should be?

No, this is how things are. I'm more than interested in ways to change the status quo, though.


The way you consume social media is not the way everyone does.

For example, my SO spends hours on end on Facebook. Depending on whether you count it as social media, I sometimes sink a lot of time (think hours) into YouTube. And that's time we're not spending on reading.

In light of this the question doesn't seem as twisted.

