Hacker News | new | past | comments | ask | show | jobs | submit | Hamuko's comments

Thankfully they added that horrible AI-slop image right at the beginning of the article, so I knew instantly not to trust the author on anything.

I wouldn't want to build a business that was so dependent on a massive third-party that can either cut off my access or copy my design at any time of their choosing.

I was thinking about this and there are several aspects that can still make this viable.

1) AI labs are incentivised to increase token consumption, because that's literally their product. The only thing they sell AFAIK is tokens (and maybe a teensy bit of user data). So if you build a product that actively reduces token consumption (which they simply cannot do without hurting themselves, even if their marketing fluff says otherwise), you'll save large amounts of money for your customers and they'll choose you.

2) Big providers want to funnel every prompt into their servers. If you're in a regulated market, or simply don't want to share every detail with an American or Chinese megacorp, you're in trouble. BUT open-weight models are now quite capable for "small business stuff" and they can be self-hosted. If you can bundle this into your service, in other words actually care about their privacy, they will choose you. Even more so if you're in Europe.

I'm shocked that big open-source projects are even using it. I was reading through the Actions documentation recently and it did make it pretty clear that you should not be using it for untrusted code.

>Running untrusted code on the pull_request_target trigger may lead to security vulnerabilities. These vulnerabilities include cache poisoning and granting unintended access to write privileges or secrets.

https://docs.github.com/en/actions/reference/workflows-and-a...
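To illustrate what the docs are warning about, here's a minimal sketch of the classic anti-pattern (workflow name, secret name, and build commands are placeholders): `pull_request_target` runs in the context of the base repository, so its secrets and a write-capable `GITHUB_TOKEN` are in scope, while the checked-out PR head is attacker-controlled.

```yaml
# Illustrative anti-pattern: pull_request_target grants base-repo secrets,
# but this job checks out and executes untrusted PR code.
name: ci
on: pull_request_target
jobs:
  test:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
        with:
          # Checks out the untrusted head of the pull request
          ref: ${{ github.event.pull_request.head.sha }}
      # npm install can run arbitrary install scripts from the PR,
      # now with SOME_TOKEN and a write-capable GITHUB_TOKEN in scope.
      - run: npm install && npm test
        env:
          SOME_TOKEN: ${{ secrets.SOME_TOKEN }}
```

The safe default (plain `pull_request`) runs the same code but with a read-only token and no secrets for fork PRs.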


I feel like GitHub should deprecate it and replace it with something like pull_request_untrusted, and make every shareable aspect (like cache or secrets) an explicit boolean opt-in.
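To be clear, pull_request_untrusted is hypothetical and nothing like it exists in GitHub Actions today, but the opt-in idea might look something like:

```yaml
# Hypothetical trigger and keys -- none of this exists in GitHub Actions.
on: pull_request_untrusted
permissions: {}    # nothing granted by default
share:
  cache: false     # each shareable aspect must be opted into explicitly
  secrets: false
```

Defaulting every sharing surface to off would make cache poisoning and secret exfiltration an explicit choice rather than a footgun.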

MPD doesn't really do streaming. If you install MPD on your server and then install an MPD client on your phone, the music will play on your server when you press play on your phone. You can re-encode the playback as an audio stream and stream that to your phone, but it's not really what MPD is built for.

MPD has built-in HTTP streaming. I know at one point it didn't, though.

https://mpd.readthedocs.io/en/latest/plugins.html#httpd

I note that they call it a plugin and also say the purpose of MPD isn't to stream.
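For reference, enabling it is just an audio_output block in mpd.conf; the port, encoder, and bitrate here are example values, not recommendations:

```
audio_output {
    type        "httpd"
    name        "My HTTP Stream"
    encoder     "vorbis"    # or "lame" for MP3, if compiled in
    port        "8000"
    bitrate     "128"
    format      "44100:16:2"
    always_on   "yes"       # keep encoding even with no client connected
    tags        "yes"       # send song metadata to clients
}
```

Any player that can open an HTTP audio stream (e.g. pointing it at port 8000 on the server) can then listen in.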


Thanks for the correction; edited my comment to ensure nobody gets confused by my mistake. (I'm brand-new to using MPD, just installed it last week, so I had things backwards in my head).

I like having the music player be separate from the client that controls it. I'm currently listening to music on my home desktop PC while using my work laptop, and I can control the music playback from the work laptop by just connecting to the daemon.

You can also use multiple clients if you want. Some TUI, some graphical, some utility like mpdscrobble (that just watches what you listen to and scrobbles it to Last.fm).


Are you still on Last.fm? I finally deleted my account last year after the site had been a ghost town for over a decade. Long decline from the early millennium when seemingly every hipster in my town scrobbled and music was an IRL social thing. If I still cared for tracking statistics without the social stuff, I would use Libre.fm.

I use it for tracking statistics without the social stuff, except maybe the year-end stats. I currently have a workflow for getting my stats out of Last.fm and then using those for building playlists, and it wouldn't work with Libre.fm.

I haven't tried it myself but I've heard good things about ListenBrainz. Maybe something for you.

Elgato and Corsair have also made a keyboard that has the Stream Deck buttons + knobs on it.

https://www.elgato.com/us/en/explorer/products/stream-deck/g...


While it's a lot easier to read than Perl, it's still not as easy as something like Python.

Probably started stealing shit before he was making $250k/year, and then just continued to do so because it works.

I can tell if my guitar's significantly out of tune, but no way I'm getting an accurate tune without a tuner.

>Also, have you noticed as top-end Mac Studios got downgraded recently? They don't want you to have access to frontier models. And you will not have it.

Isn't that a function of RAM supply not being available now?


OpenAI did buy out the RAM supply to block competition. Arguably local models are one of its (smaller) competitors.

Even if that weren't the case, every corp _needs_ you to be on a subscription.


They didn't really even buy the RAM. But there's pretty significant demand for RAM in general, with data centers being planned left and right.
