writebetterc's comments | Hacker News

The goal isn't to exchange facts, it's to bond.


Well, I doubt I'm going to bond by being subjected to painful boredom, and neither will they. I've also found it's socially common to completely forget everything people tell you in small talk. Because our interactions are infrequent, I remember every one of these facts about the other person, while they act like they've never met me the next time we talk, and I can't help but find that a bit hurtful.


I think small talk just hasn't brought you to the right topic. If you reached a topic you both fancy, you would definitely remember each other. If anything, having some small talk is nicer than sitting in awkward silence.


> I doubt I'm going to bond by being subjected to painful boredom, and neither will they

Which is why it's important to surround yourself with likeminded folk.

Nobody is compelling you to do this with every single person you meet.


What value does the word have? When I'm writing a compiler, it doesn't matter whether I target C, asm, or JavaScript as my output language. I'll still write it the same way.
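To sketch that point in code (all names here are hypothetical, purely for illustration): the front end does the same work regardless of target, and only a pluggable backend knows whether the output is "transpiled" source or "compiled" asm.

```java
// Hypothetical sketch: the pipeline is identical regardless of target;
// only the backend chosen at the end differs.
interface Backend {
    String emitAdd(String lhs, String rhs);
}

public class Pipeline {
    // Shared front-end work: lower an addition to a call on the backend.
    static String compileAdd(String a, String b, Backend target) {
        return target.emitAdd(a, b);
    }

    public static void main(String[] args) {
        Backend js = (l, r) -> l + " + " + r + ";";      // "transpiler"-style output
        Backend asm = (l, r) -> "add %" + l + ", %" + r; // "compiler"-style output
        System.out.println(compileAdd("a", "b", js));    // a + b;
        System.out.println(compileAdd("a", "b", asm));   // add %a, %b
    }
}
```

Whether the emitted string is JavaScript or assembly, the structure of the compiler itself is unchanged.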


OK, but words are not only for compiler-writers. As someone who encounters your compiler, if it targets an output language at a similar level to the input language, seeing it referred to as a transpiler rather than simply a compiler gives me a head start in understanding what it does.

Overall, I find this discussion very odd. It seems like a kind of deletionism for the dictionary. I mean, what's the use of the word 'crimson'? Anything that's crimson is also just 'red'. Why keep 'large' when we have 'big'? You could delete a large percentage of English words by following this line of thinking.


It gives you a better idea what a thing does?


To me, it doesn't. If someone says "tsc is a transpiler", it gives me nothing actionable. If you do say "it transpiles to JS", then I've got something, but that could just be "compiles to JS". It doesn't really tell me how the thing is constructed either.


I'd probably say that "transpiler" is not a very useful word with that definition.


Why is it useless? 'Compiler' denotes the general category, within which exist various sub-categories:

For example, a 'native compiler' outputs machine code for the host system, a 'cross compiler' outputs machine code for a different system, a 'bytecode compiler' outputs a custom binary format (e.g. VM instructions), and a 'transpiler' outputs source code. These distinctions are meaningful.


I can't see why; I do think that the word conveys some sort of useful meaning with the above definition.


I'm sure that if you went back 100 years you'd be less surprised, but of course the rice would've been replaced with oat porridge or potatoes.


The old European one would have been bread: the traditional 2lb/900g-ish loaf would have been consumed in a day. Apparently Turkey still has very high levels of bread consumption.


Germany as well; it's bread with everything. Sometimes it feels like being back in the Middle Ages, using bread as plates.


Maybe they were on to something we've lost: bread is delicious.


> I think this is a dumb idea, a mix of wishful thinking and immature psychology. You become someone because of your competence, not of what you believe in.

Shit works, it's not dumb at all.


"good outcomes" doesn't have to be the best outcome for you, personally.


There's no part of me that wants to maintain relationships for the express purpose of extracting value in the future for gain -- personal or otherwise.

I simply refuse to let the end justify the means, whatever that end is.


Java doesn't produce 'garbage data'; racing writes will never show intermediate results.


Loss of sequential consistency means that, realistically, humans cannot understand the behaviour of the software, so "garbage" seems like a reasonable characterisation of the results. It might take a team of experts considerable time to explain why your software did whatever it did; alternatively, you can say "that's garbage" and try to fix the problem without that understanding of the results.

The hope when Java developed this memory model was that loss of SC is something humans can cope with. It was not; the behaviour is too strange.


Even with sequential consistency, interleaving multiple threads produces results that go firmly into the garbage category. Even something as simple as adding to a variable can fail.

To solve bad interleavings, add mutexes. And once you have mutexes, you'll also be protected from weak memory models.
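A minimal sketch of that fix in Java (hypothetical class name): a counter guarded by a `ReentrantLock`, which plays the mutex role, so increments from concurrent threads can no longer interleave mid-update.

```java
import java.util.concurrent.locks.ReentrantLock;

// Minimal sketch: a counter protected by a mutex.
class LockedCounter {
    private final ReentrantLock lock = new ReentrantLock();
    private long count = 0;

    void increment() {
        lock.lock();          // acquire the mutex
        try {
            count++;          // the read-modify-write can no longer interleave
        } finally {
            lock.unlock();    // always release, even on exception
        }
    }

    long get() {
        lock.lock();
        try { return count; } finally { lock.unlock(); }
    }
}
```

The lock's acquire/release also establishes happens-before ordering, which is why, as the comment above says, mutexes protect you from weak memory model effects as well.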


What is the end result of two threads adding +1 to the same counter 1 million times each in parallel?

Well, the result is safe but unknowable. I would call that garbage data.
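A minimal sketch of that experiment (hypothetical class name): the unsynchronized counter ends up at some unpredictable value up to 2,000,000, while an `AtomicLong` always reaches exactly 2,000,000.

```java
import java.util.concurrent.atomic.AtomicLong;

public class RaceDemo {
    static int plain = 0;                       // unsynchronized: updates can be lost
    static final AtomicLong atomic = new AtomicLong();

    public static void main(String[] args) throws InterruptedException {
        Runnable work = () -> {
            for (int i = 0; i < 1_000_000; i++) {
                plain++;                        // racy read-modify-write
                atomic.incrementAndGet();       // atomic increment
            }
        };
        Thread t1 = new Thread(work);
        Thread t2 = new Thread(work);
        t1.start(); t2.start();
        t1.join(); t2.join();
        // plain is "safe but unknowable": int accesses never tear into
        // garbage bits, but any total up to 2,000,000 is a legal outcome.
        System.out.println("plain  = " + plain);
        System.out.println("atomic = " + atomic.get());  // always 2000000
    }
}
```

This is the distinction the thread is circling: the racy result is a well-formed integer (safe), but which integer you get is not something you can reason about (unknowable).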


Why should we care about the COSMIC Desktop Environment?

Edit: I've now gotten 2 downvotes in 4 minutes. I do not understand what's so controversial about this comment. Why should we care about having a third DE? Does this matter to users at all? I've watched several videos showcasing it, and there seems to be no point to it except organizational (Pop OS wants to break free from GNOME).


Your comment could have been posted under almost ANY story on HN. "Why should we care about .... ?" In other words, it's a very low value comment as far as discussion goes. If you don't care about it, move on; nobody said you had to care.


Because there are a number of existing frustrations with GNOME that cannot be fixed in GNOME, because the GNOME Foundation has its own specific vision, which not a lot of people like.

KDE is good but has its own flaws, and it's a different workflow.


IMO KDE is not a different workflow. You can easily recreate all of gnome in KDE.

So, just use KDE.


Can KDE do dynamic workspaces now? That was one of the nicer things about GNOME last time I used it.


Yes, it's in the overview.


You really can't. I've tried, and I try again every couple of years. I hope it gets there one day, though.

It's also usually buggier for me, so there's that.


I legitimately cannot think of a single feature in Gnome, Windows, or MacOS that does not exist in KDE.

Now, granted, you would only know that if you're a KDE wizard. So maybe Gnome defaults coincidentally align better with what you like.

But in terms of functionality, there just isn't any contest.


Making a single virtual desktop on my second monitor that remains static while the primary switches between them.

Having a good screenshot tool on Wayland that freezes the screen, lets me select a region of the screen in an overlay, then puts the selection in my clipboard and vanishes in the background. The KDE one insisted on spawning its window and wasn't very good, and third-party tools failed at region selection on Wayland, especially with scaling enabled.

Not having random Z-index order bugs is also a big one


Tried KDE. I very much prefer Gnome with extensions


Pro-tip: These are inside thoughts, no need to comment, just have the thought and move along.


Not being able to question large investments is not a healthy perspective. And as this is the first public release with Cosmic, I think it's completely fair for people to question it.


Sure, but there was not much in that question.


Sorry, I don't see it that way.


Which is why you were downvoted.


Who is "we"? But to answer your question, it was created because System76 didn't find an existing DE that met their needs, even after extensive changes to them.


> Downvotes
As someone else mentioned, it's because it's a very low value comment.

It provides nothing to the discussion except a bad attitude.

It's clear from the fact that you're asking that you don't know about the topic. Go investigate the COSMIC desktop if you don't know why we should care about it, and you can find out for yourself whether or not it's worth caring about.

If you find out that you don't think this is something that should be discussed, don't upvote the thread and move on, simple as that. If the thread gets many upvotes anyway, you can infer people care about it, even if you don't. Comments like this only pollute the discussion and make everything feel negative.

It's a big part of the reason the internet has become such a drag: people always feel the need to comment on every little thing, even if it only adds negative value.


Perfectly valid question. I remember Ubuntu getting lost down a rabbit hole with their touch-screen desktop WM, which had so many warts. I guess if you don't have the power to steer a project, you fork or try something else. Then you realise you have neither the UI chops nor the technical finesse to pull it off. Windows has been atrocious UI-wise since Win 8: it failed to pull anything off cohesively and just left a mess. Much like Ubuntu, but with what you would expect from a well-funded, dedicated UI team.

UIs generally suck in Linux, with the exception of the shell. And even that could be sexed up hugely.

The best thing about AI tools for me is that they make up for shortcomings in the UI and have become a very important go-between for me.


I personally care because it’s Rust based and that means I’m likely to participate in development and feel comfortable with the tooling.

As a broader picture, it matters because the creator is System76, which sells laptops and desktops and is moving towards a fully polished Linux-on-desktop experience. You can really only go so far if you are not deeply involved in the DE yourself.

p.s.: your question is very legit imo, don’t get the downvotes


Intel Arc Pro B60 will come in a 48GB dual-GPU model. So yeah, hardware is gonna be there, and the 24GB model will be $599 from Sparkle. I assume 48GB will be cheaper than a hacked RTX 4090.

Look at this: https://www.maxsun.com/products/intel-arc-pro-b60-dual-48g-t... https://www.sparkle.com.tw/files/20250618145718157.pdf


Keep in mind that the dual-GPU card works via PCIe bifurcation, so if you use two B60s on a motherboard similar to the one in the article, you'll only see two GPUs, not the full four. Hence just 48GB of VRAM, not 96GB.


Yeah, but the B60 is basically half the speed of a 3090... in 2025. I'd rather buy 5-year-old Nvidia hardware for $100 more on eBay than an Intel product with horrendous software support that's effectively half the speed. This build is so cool because the 2x 3090 setup is still maybe the best option 5+ years after Nvidia released the GPU.


There are old Arabic maps which have south at the top.


Ancient Egypt too. It was more natural for the Nile to run 'downwards'. This is also why Upper Egypt is in the south and Lower Egypt is in the north.

