
This is really funny coming from Durov, CEO of an IM app that doesn't even have E2EE on by default (or even available for group chats). Both WhatsApp and Telegram are terrible choices.

Unfortunately most people couldn't care less. Bluesky has been lying about being decentralized since day 1, and yet they have millions of users.

Bluesky has been asymptotically approaching full decentralisation. A few years ago the gap was everything except a decentralised design, then it was AppViews, and now it's "tooling and documentation" for the part of the PKI that only 50 entities have implemented.

Meanwhile I lost my Mastodon account history because I moved once, couldn't interact with half the network or apps because I was on an instance running a non-Mastodon codebase, and lost my account again because I stopped paying for access to the instance I was on: all classic signs of centralisation.


  > all classic signs of centralisation.
No, these are classic signs of decentralization.

  >  I lost my Mastodon account history because I moved once
Your posts still exist on every server that federated with you; there's just no central authority to coordinate reclaiming them.

  > couldn't interact with half the network or apps because I was on a non-Mastodon codebase instance
Independent implementations having compatibility issues is what happens when there's no central authority enforcing conformance. Frustrating, yes, but it's a symptom of decentralization.

  > lost my account again because I stopped paying for access to the instance I was on
That's just how paying for services works. You could host your own instance, and then nobody but you can revoke your access.

On Mastodon, if something goes wrong, nobody can cut you off the network entirely. On Bluesky, the author deleted an empty test account and is now blacklisted network-wide until Bluesky support decides to help. That is a classic sign of centralization.


Being beholden to a particular server I have no control over sounds like what happened with Twitter/X.

The posts might exist, but they aren't associated with me. Why not? Because I was locked in to one instance and unable to vote with my feet and go elsewhere.

Maybe I stopped paying because the instance owner enforced sanctions against my country? Why should I lose my identity because of that?

> Independent implementations having compatibility issues is what happens when there's no central authority enforcing conformance. Frustrating, yes, but it's a symptom of decentralization.

Compatibility issues mean lock-in to instances under individual control. Shared protocols mean lock-in to a protocol, but ultimately freedom to move. We know that open protocols trump opt-in collaboration by private entities when it comes to freedom.

> You could host your own instance, and nobody but yourself can revoke your access.

See also: instances not federating with other instances that are too small. You technically can host your own, but in practice it goes nowhere.

> On Mastodon, if something goes wrong, nobody can cut you off the network entirely.

Bluesky is not perfect, but where it's quickly approaching full decentralisation on a solid foundation, ActivityPub has become the Mastodon show: less a decentralised social network and more a federated set of centralised services with little accountability to users. You can't move, you can't control the content you see, you can't even search. It's a reversion to the days of 14-year-olds drunk on power as mods on a phpBB forum, or the Reddit mods of today.


I've realised that social networks are real-time feeds, not archives. Some archival features can be useful but they are not the main focus of the product. Archival needs are very different from real-time needs and combining them in the same product doesn't work out well.

Consider something simple like Slack: the selling point is that you can send messages to people. Being able to scroll back to last week is useful. Being able to scroll back 3 years is a nonessential bonus.


They are at 0.1% decentralisation; how can you extrapolate asymptotic decentralisation from that?

I honestly can't tell if this comment is trolling.

I'll admit it's a bit charged, but I'm frustrated with bad faith takedowns of ATProto/Bluesky, while Mastodon (and it is Mastodon, not ActivityPub) solves almost none of the actual problems. I tried implementing my own ActivityPub server and the spec is so hilariously lacking that it's understandable that everyone just uses the Mastodon API instead.

ActivityPub isn't actually the spec of Mastodon. Treat claims of "Mastodon is ActivityPub" the same as you treat claims of "Bluesky is decentralised."

Just expose the same interface Mastodon does and you'll be fine. Note that almost nothing cares about the exact URLs you use (except for WebFinger), but everything does care about the domain matching the right side of the @ sign.
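For reference, the WebFinger piece is tiny. A minimal sketch (the instance and user are made up; the actor URL can be whatever you like as long as the lookup points at it):

    # Another server resolving @alice@example.social only ever hits this well-known path:
    curl -s 'https://example.social/.well-known/webfinger?resource=acct:alice@example.social'

    # A Mastodon-compatible reply is a small JSON document along these lines:
    # {
    #   "subject": "acct:alice@example.social",
    #   "links": [
    #     { "rel": "self",
    #       "type": "application/activity+json",
    #       "href": "https://example.social/users/alice" }
    #   ]
    # }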


> Treat claims of "Mastodon is ActivityPub" the same as you treat claims of "Bluesky is decentralised."

Not sure if you meant this in the way I read it, but I believe that Bluesky is pretty much decentralised and tidying up the last bits of that, and I also believe that Mastodon is functionally ActivityPub and probably mopping up the last bits where the open spec meant anything.

The problem with ActivityPub is that it was missing at least half of what would be necessary to do anything with it, maybe more. You certainly can't create clients with it, and it doesn't define anything about writing, etc. It's good that it's an open spec, but I see it as closer to Open Graph tags on web pages than it is to a social network foundation. That's fine... but we treat "Mastodon" as open because of ActivityPub, when in reality almost the entire system is defined by a Rails API implementation and its idiosyncrasies. I see it as a problem that you can't participate in the network without implementing an API with one implementation, rather than by implementing to a spec.


> Bluesky is pretty much decentralised

????? what data could possibly lead to this conclusion?

https://arewedecentralizedyet.online/


> Mastodon (and it is Mastodon, not ActivityPub) solves almost none of the actual problems. I tried implementing my own ActivityPub server and the spec is so hilariously lacking that it's understandable that everyone just uses the Mastodon API instead.

Misskey is an independent implementation, and it's actually what the biggest instance runs (or at least did a few years ago).


I think most Bluesky users were happy with a centralized Twitter as long as the people running it were ideologically aligned.

I think a lot of those users do care, but they don't know they've been lied to.

I switched from GitLab to Forgejo for my private projects because I no longer wanted to deal with how slow GitLab's interface is.

I still have proper CI, issue tracking, and all the other features I care about, but the interface loads instantly and my screen isn't filled with features I'll never use for my private projects.


Forgejo's code review tool slavishly follows GitHub (like a lot of other things it does) and so has the same inferior developer workflow that comes with that.

GitLab is no Gerrit, but it does at least support stacked MRs, and it at least lets you see comments between force pushes / rebases, even if it doesn't track them.

I use Codeberg, and therefore Forgejo, for my open source project, but frankly the GH-style workflow is not appropriate for serious software development. It forces one to either squash all commits or use <gag> merge commits. Many people have developed Stockholm syndrome around this and can't imagine any other way. But it sucks.

The GH model encourages big-bang, all-at-once giant PRs, and it's corrosive to productivity and review culture inside teams. And it leads to dirty commits in the git history ("fix for review comments." "merge." "fix for review comments." etc.)

I worked with GitLab for a year and a half at a job, and I prefer its review tool for functionality, though not necessarily UX.


I came here to say this. I switched to Forgejo after self-hosting GitLab for years, and haven't missed anything.

The article mentions the container registry as a prime feature of GitLab. Forgejo has this too, btw.
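If you haven't tried it: the registry just hangs off your instance's domain, so it's the usual docker dance (host, user, and image names below are made up; adjust for your setup):

    # Authenticate against the Forgejo instance's built-in registry.
    docker login git.example.com

    # Tag and push an image into your own namespace on that instance.
    docker tag myimage:latest git.example.com/myuser/myimage:latest
    docker push git.example.com/myuser/myimage:latest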

In addition, the speed (of everything) is so good with Forgejo. The resource requirements (napkin math, but...) are about 10% of GitLab's.

I see no reason to ever use GitLab again.

There are two minor annoyances for me, but they aren't deal breakers. First, I actually prefer the GitLab CI syntax; "GitHub Actions" is a mess. I suppose it makes sense to follow the dominant gorilla (GitHub Actions), but converting to this CI was more trouble than it should have been.

Also, the Forgejo API is much less developed. I did like exploring with GraphQL, which is totally missing in Forgejo. But you have direct access to the database (and can use SQLite or Postgres, your choice), so you can really do whatever you want with a custom script. The Forgejo API and the infrastructure around it are just a bit clunkier, but nothing that was a major problem.
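To give an idea of what "a custom script" looks like in practice (instance URL, token, and database path are placeholders; the table and column names follow upstream Gitea's schema and may differ between versions):

    # The REST API lives under /api/v1 and takes a token in the Authorization header.
    curl -s -H "Authorization: token $FORGEJO_TOKEN" \
      'https://git.example.com/api/v1/repos/search?limit=10' | jq -r '.data[].full_name'

    # Or skip the API and query the database directly (SQLite shown here).
    sqlite3 /var/lib/forgejo/data/forgejo.db \
      'SELECT lower_name, num_issues FROM repository ORDER BY num_issues DESC LIMIT 10;'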


Here's a link to a previous HN discussion on Forgejo:

https://news.ycombinator.com/item?id=42753523


One of the reasons I like how lightweight Gitea/Forgejo is: it lets me develop with Argo CD locally. Spin up a kube cluster with Tilt, bootstrap Forgejo, bootstrap Argo and point it at Forgejo, and now I can test ApplicationSet changes with sync waves locally.

Just tried it out for a bit, and it looks great and is super snappy. It seems the CI portion is delivered by a project called Woodpecker? How does this work, and how does it compare to GitLab CI?

Forgejo has an integrated CI/CD solution, Forgejo Actions [1], that is very similar to GitHub Actions (and thus not so similar to GitLab CI). This is what you'll probably use if you self-host.

Codeberg (a public Forgejo-based forge) also offers Woodpecker CI. Their hosted Forgejo Actions is still in beta AFAIK, but you can also use your self-hosted runners.

[1] https://forgejo.org/docs/next/user/actions/quick-start/


Is there any point in switching to Forgejo for my open source projects? Wouldn't I just be leeching resources from the guys at Codeberg/wherever instead of Microsoft?

Codeberg _wants_ to host open source projects; it isn't leeching any more than adding articles to Wikipedia.

If you feel guilty, you can self-host Forgejo, contribute to Forgejo, or become a Codeberg member and pay them a yearly fee of your choosing (https://join.codeberg.org/).


Don't forget about Alzheimer's disease

Only tangentially related, but today I found a repo that appears to have been developed with AI assistance, and the costs of running the agents are reported in the PRs. For example, 50 USD to remove some code: https://github.com/coder/mux/pull/1658

Lol, this is my PR. That cost is misleading. That workspace did far more than that change. In reality I spend ~$1000/week in tokens for all of my development work, and I'm quite happy with the exchange.

This is the cost of Anthropic's pay-by-the-token plan.

To give an analogy, Anthropic's pricing is $0.10 per grain of rice (pay by the token), or:

$20 a month for a quarter cup of rice each day (claude pro)

$100 a month for 10 cups of rice each day (claude max 100)

$200 a month for a sack of rice delivered to your door each day.

It's a rather insane pricing scale, and here they are paying $50 because they don't understand the pricing model (which is fair, Anthropic's pricing model is crazy). Never pay by the token with Anthropic! Only ever use the subscription plans.


At work we pay per token because we use a third-party tool (Amp), and it is very pricey. OTOH the subscription model is obviously not profitable. It's priced to capture market share, like Uber in 2010. (TBH the token pricing is probably not profitable either.)

From the point of view of a large enterprise, it makes more sense to pay a third party that can itself swap out Claude for Gemini (for example) based on pricing, than to buy subscriptions to one specific tool that is likely to suddenly cost 10x as much when they run out of VC funds. The dynamic is going to be different for individuals and small companies.


I'd like to suggest that the dev lower their settings: SOTA model + high + thinking is definitely not needed for this simple task. Lower settings could easily do it for less than $0.50, maybe even $0.05. I'd encourage people to operate on average-to-low settings and wind them up or down depending on the complexity of the task.

Also helps to use big models for planning, small for implementation.

That seems reasonable compared to an actual developer (depending on your region), but I had hoped these models would make simple tasks like this fast and cheap so those developers can focus on the difficult stuff.

Yes, in practice I've done fairly simple tasks using Amp at work and ended up with bills of $100-150 for them, which is somewhat less than (but the same order of magnitude as) contracting out to LCoL countries. Depending on how the technology (towards cheaper cost) and the financial situation (towards profitability and higher costs) evolve, it wouldn't be shocking to me if the frontier models ended up more expensive than human contractors.

One could of course write and debug the code themselves


Which is essentially just this:

    yt-dlp -o "%(channel)s/%(playlist_title)s/%(title)s.%(ext)s" -a playlists.txt
I'm not sure if that warrants an HN post


Right? Just add this to .bashrc:

    alias yt-pl='yt-dlp -o "%(channel)s/%(playlist_title)s/%(title)s.%(ext)s" -a playlists.txt'


Mattermost is MIT licensed. What is stopping anyone from removing this restriction?


Maintaining your own fork is a ton of work. Even if it's just routinely rebasing on upstream and maintaining your own upgrade infrastructure and doing releases, that's far from trivial.

The open source community really needs to stop with the "just fork it" mindset.


> Maintaining your own fork is a ton of work. Even if it's just routinely rebasing on upstream and maintaining your own upgrade infrastructure and doing releases, that's far from trivial.

Well, I did it for Mattermost and for some other software as well. Sure, it's some work, but it's not "a ton" of work, and it may not be "trivial" but it is also not "far" from trivial.

Do it like Linux maintainers maintain a ton of patched RPMs, debs, etc. Just keep a patch in Git. For every release of Mattermost you do a Git clone, apply your patch and build it. Most of the time the patch will just apply cleanly. Sometimes you need to make a few adjustments; you make them and put them back in Git. There is no extensive release management or anything. You just build a patched version for every released version.
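Concretely, the loop per release is something like this (the tag, patch file, and build target are examples, not gospel; check upstream's build docs for your version):

    # Grab the upstream release you want to patch (tag is an example).
    git clone --depth 1 --branch v10.5.1 https://github.com/mattermost/mattermost.git
    cd mattermost

    # Apply your patch; if it stops applying cleanly, fix it up and re-export it.
    git apply ../remove-restriction.patch

    # Build the way upstream documents it (target name is an assumption).
    make build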


> The open source community really needs to stop with the "just fork it" mindset.

It's the right mindset. It's just not applicable to projects that are built mostly by the company, because none of the contributors will move, so it's essentially trying to build a new team from scratch.


I don't think the implication is that anyone as an individual would fork it.

I think the implication is that some other interested org could very easily step in and assume the role that the Mattermost org was in, and everyone would very eagerly switch and leave Mattermost itself speaking to an empty room.


You still need someone to do the thankless work, which many are naturally not interested in.


You actually don't have to maintain the fork and/or update to the latest version if you don't need new features.


You don't have to maintain the fork and/or update to the latest version if you don't need new features or security fixes.

Most people want security fixes.


Or patched vulnerabilities.


>The open source community really needs to stop with the "just fork it" mindset.

The open source community really needs to stop with the "just do everything i want for free" mindset.

I mean, open source does not mean you're entitled to free support, and free in free software is not about money. I think people depend too much on those projects and then act entitled.

Of course the open source bait-and-switch done by companies is shitty behavior worth calling out, but companies exist to earn money, and at this point this can be expected.


From my observation, Mattermost is not software you buy "support" for. It either works and is self-manageable, or you use something else. I guess Mattermost (as in the company) saw that too and now uses shitty practices to coerce people into buying it.


I don't think I've expressed a "just do everything I want for free" mindset. In fact, I'm pushing against the idea that someone should just fork Mattermost and maintain that fork for free.

I do think this development represents a bait and switch though.


> Of course the open source bait and switch done by companies is a shitty behavior worth calling out,

Yes, that’s what we are doing here.

> but the companies exist to earn money and at this point this can be expected.

Expected != ethical. Also not a necessary, logical outcome.

What is legitimately expected is a pro version that has more corporate features. We’re not talking about $Xx/user/mo to enable SSO here, though.


I've used MM for about a year. Forking it would be a major undertaking, as the number of vulnerabilities you would need to backport fixes for is quite high (like 5 a month?). Last time they removed features from the free tier (group calls in v10) there was a lot of grumbling, but that's it.


Maintaining a patch set for Mattermost is almost trivial. I did it for several years to authenticate users to internal Active Directory and found it easy enough to understand.


https://github.com/mattermost/mattermost/issues/34271#issuec...

Wanting to use Mattermost's binaries rather than building from source?

Re licensing see: https://isitreallyfoss.com/projects/mattermost/


It’s not open source, it’s “open core” SaaS.


I don't know, but that seems somewhat beside the point. The restriction obviously was not added to test people's ability to remove it.


No. The binaries they prepackage for you are MIT. If you want the source it is AGPL or you pay for a proprietary license.


Nothing. Open Source is dying. The model that financed open source work (well-off suburban American dads, or open source as a portfolio show-off) no longer applies. The old generation that believed in this model is retiring, and for the new generation it pays better to "network", grind leetcode, or spam your resume to thousands of employers.

Now couple that with the fact that supply-chain control is profitable (legally or illegally); I think the next 5-10 years will be interesting.


There never was a model to fund open source, at least outside the largest and most widespread codebases. I think that reality is finally hitting. Free money has run out, and now software must stand either as community efforts, as widely enough used foundations, or on forced support.


I don't think anything has changed really. Open source never really had a good funding option.


Almost seems like there is now too much money in software. The old times felt like computer science was mostly a science.


Glancing through the code, it doesn't seem like it would be that hard to remove limitations such as this. PostHistoryLimit/postHistoryLimit is interpreted from the license limits. A little poke here and there and I'd guess the limitations would disappear.
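If anyone wants to check my read, here's a hypothetical starting point against a checkout of the repo (the identifier is the one mentioned above; the file globs are a guess):

    # Find where the post history limit is read out of the license limits.
    grep -rni "posthistorylimit" --include='*.go' --include='*.ts' --include='*.tsx' .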


The time and energy it takes to do it and build it, then make it easy for current users to point their automatic updates at the fork, then maintain it, etc.


The compiled binary is.

The source code is... AGPL licensed? But not the admin tools. They seem to be licensed under the Apache License 2.0.

--------

Yeah, good luck. Contact your lawyer.


AGPL and Apache are both open source licenses. So I’m not getting what the confusion would be as an end user, who won’t be modifying the software or packaging it for sale.


They're both FREE software licenses, which is more.

https://www.gnu.org/licenses/license-list.html


> Yeah, good luck. Contact your lawyer.

Why? The intent seems pretty clear and they're legally allowed to do this because all contributors signed a CLA.


Explain please. This interests me and I'm extremely curious about what you mean.


Combining source code under different licenses into one product is a nightmare.

You have to follow the AGPL's "no additional restrictions" clause while also following the Apache License, and the Apache License might require you to follow additional restrictions.


Honestly, this has never been an issue for me. Sure, I have had to explain the limits of the licenses and check that I understand them. I guess it depends on your use case, so I am still unsure how this became a problem for you.


Believe it or not, Ubuntu is not the only Linux distribution.


The link is up; your ISP is likely blocking Anna's Archive.

