Well, I doubt I'm going to bond by being subjected to painful boredom, and neither will they. I've also found it's socially common to completely forget everything people tell you in small talk. Because our interactions are so infrequent, I remember every one of these facts about the other person, while the next time we talk they act like they've never met me in their life, and I can't help finding that a bit hurtful.
I think small talk just didn't bring you to the right topic; if you were to reach a topic you both fancy, you would definitely remember each other. If anything, making small talk is nicer than sitting in awkward silence.
What value does the word have? When I'm writing a compiler, it doesn't matter whether I target C, asm, or JavaScript as my output language. I'll still write it the same way.
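A minimal sketch of that point (Java, all names hypothetical): the compiler proper is written the same way regardless of target; only the backend's leaf-level output strings change, whether or not the target happens to be another source language.

    // Hypothetical sketch: one "compiler" driving interchangeable backends.
    // Targeting C or JavaScript changes only the strings each node emits,
    // not how the compiler itself is structured.
    interface Backend {
        String declare(String name, String expr);
    }

    class CBackend implements Backend {
        public String declare(String n, String e) { return "int " + n + " = " + e + ";"; }
    }

    class JsBackend implements Backend {
        public String declare(String n, String e) { return "let " + n + " = " + e + ";"; }
    }

    public class Codegen {
        // The compiler proper: identical regardless of the target language.
        static String compile(Backend target) {
            return target.declare("x", "1 + 2");
        }

        public static void main(String[] args) {
            System.out.println(compile(new CBackend()));  // int x = 1 + 2;
            System.out.println(compile(new JsBackend())); // let x = 1 + 2;
        }
    }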
OK, but words are not only for compiler-writers. As someone who encounters your compiler: if it targets an output language at a similar level to the input language, seeing it referred to as a transpiler rather than simply a compiler gives me a head start in understanding what it does.
Overall, I find this discussion very odd. It seems like a kind of deletionism for the dictionary. I mean, what's the use of the word 'crimson'? Anything that's crimson is also just 'red'. Why keep 'large' when we have 'big'? You could delete a large percentage of English words by following this line of thinking.
To me, it doesn't. If someone says "tsc is a transpiler", it gives me nothing actionable. If you do say "it transpiles to JS", then I've got something, but that could just be "compiles to JS". It doesn't really tell me how the thing is constructed either.
Why is it useless? 'Compiler' denotes the general category, within which exist various sub-categories:
For example, a 'native compiler' outputs machine code for the host system, a 'cross compiler' outputs machine code for a different system, a 'bytecode compiler' outputs a custom binary format (e.g. VM instructions), and a 'transpiler' outputs source code. These distinctions are meaningful.
The old European one would have been bread: a traditional loaf of roughly 2lb/900g would have been consumed in a day. Apparently Turkey still has very high levels of bread consumption.
> I think this is a dumb idea, a mix of wishful thinking and immature psychology. You become someone because of your competence, not of what you believe in.
There's no part of me that wants to maintain relationships for the express purpose of extracting value from them in the future -- personal gain or otherwise.
I simply refuse to let the end justify the means, whatever that end is.
Loss of sequential consistency (SC) means that, realistically, humans cannot understand the behaviour of the software, so "garbage" seems like a reasonable characterisation of the results. It might take a team of experts considerable time to explain why your software did whatever it did; alternatively, you can say "that's garbage" and try to fix the problem without that understanding of the results.
The hope when Java developed its memory model was that loss of SC was something humans could cope with. It was not; the behaviour is too strange.
Even with sequential consistency, interleaving multiple threads produces results that go firmly into the garbage category. Even something as simple as incrementing a variable can fail.
To solve the interleaving problem, add mutexes. And once you have mutexes, you're also protected from weak memory models.
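A minimal sketch of both halves of that claim (Java, names hypothetical): a plain counter incremented by two threads usually loses updates, because count++ is a three-step read-modify-write the threads interleave; the same loop under a lock always lands on the expected total.

    // Hypothetical sketch: lost updates on a plain int, fixed with a lock.
    public class RaceDemo {
        static int racyCount = 0;                    // unsynchronized shared state
        static int safeCount = 0;
        static final Object lock = new Object();     // mutex guarding safeCount

        public static void main(String[] args) throws InterruptedException {
            Runnable racy = () -> {
                for (int i = 0; i < 1_000_000; i++) racyCount++;  // races: read, add, write
            };
            Runnable safe = () -> {
                for (int i = 0; i < 1_000_000; i++) {
                    synchronized (lock) { safeCount++; }          // mutual exclusion
                }
            };
            Thread a = new Thread(racy), b = new Thread(racy);
            Thread c = new Thread(safe), d = new Thread(safe);
            a.start(); b.start(); c.start(); d.start();
            a.join(); b.join(); c.join(); d.join();
            System.out.println("racy: " + racyCount); // usually well under 2000000
            System.out.println("safe: " + safeCount); // always 2000000
        }
    }

In Java the synchronized block also establishes the happens-before edges the memory-model discussion above is about, which is the sense in which mutexes protect you from weak memory models.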
Why should we care about the COSMIC Desktop Environment?
Edit: I've now gotten 2 downvotes in 4 minutes. I don't understand what's so controversial about this comment. Why should we care about having a third DE? Does this matter to users at all? I've watched several videos showcasing it, and there seems to be no point to it except an organizational one (Pop OS wants to break free from GNOME).
Your comment could have been posted under almost ANY story on HN: "Why should we care about .... ?" In other words, it's a very low-value comment as far as discussion goes. If you don't care about it, move on; nobody said you had to care.
Because there are a number of existing frustrations with GNOME that cannot be fixed in GNOME, since the GNOME Foundation has its own specific vision, which not a lot of people like.
KDE is good but has its own flaws, and it's a different workflow.
Keeping a single, static virtual desktop on my second monitor while the primary switches between desktops.
Having a good screenshot tool on Wayland that freezes the screen, lets me select a region of the screen in an overlay, then puts the selection in my clipboard and vanishes into the background. KDE's tool insisted on spawning its own window and wasn't very good, and third-party tools failed at region selection on Wayland, especially with scaling enabled.
Not having random z-order bugs is also a big one.
Not being able to question large investments is not a healthy perspective. And as this is the first public release with Cosmic, I think it's completely fair for people to question it.
Who is "we". But to answer your question, it was created because System76 didn't find an existing DE that met their needs even after extensive changes to them.
As someone else mentioned, it's because it's a very low-value comment.
It provides nothing to the discussion except a bad attitude.
It's clear from the fact that you're asking that you don't know about the topic. Go investigate the COSMIC desktop if you don't know why we should care about it, and you can find out for yourself whether or not it's worth caring about.
If you decide it isn't something that should be discussed, don't upvote the thread, and move on; simple as that. If the thread gets many upvotes anyway, you can infer that people care about it, even if you don't. Comments like this only pollute the discussion and make everything feel negative.
It's a big part of the reason the internet has become such a drag: people always feel the need to comment on every little thing, even when it only adds negative value.
Perfectly valid question. I remember Ubuntu getting lost down a rabbit hole with their touch-screen desktop WM, which had so many warts. I guess if you don't have the power to steer a project, you fork it or try something else, then realise you have neither the UI chops nor the technical finesse to pull it off. Windows has been atrocious UI-wise since Win 8: it failed to pull anything off cohesively and just left a mess. Much like Ubuntu, but with what you would expect from a well-funded, dedicated UI team.
UIs generally suck in Linux, with the exception of the shell. And even that could be sexed up hugely.
The best thing about AI tools for me is that they make up for shortcomings in the UI and have become a very important go-between for me.
I personally care because it's Rust-based, which means I'm likely to participate in development and feel comfortable with the tooling.
In the broader picture, it matters because the creator is System76, which sells laptops and desktops and is moving towards a fully polished Linux-on-the-desktop experience. You can really only go so far if you are not deeply involved in the DE yourself.
P.S.: your question is very legit IMO; I don't get the downvotes.
The Intel Arc Pro B60 will come in a 48GB dual-GPU model. So yeah, the hardware is gonna be there, and the 24GB model will be $599 from Sparkle. I assume the 48GB model will be cheaper than a hacked RTX 4090.
Keep in mind that the dual-GPU model works via PCIe bifurcation, so if you use two B60s on a motherboard similar to the one in the article, you'll only see two GPUs, not the full four. Hence just 48GB of VRAM, not 96GB.
Yeah, but the B60 is basically half the speed of a 3090... in 2025. I'd rather buy 5-year-old Nvidia hardware for $100 more on eBay than an Intel product with horrendous software support that's effectively half the speed. This build is so cool because the 2x 3090 setup is still maybe the best option 5+ years after Nvidia released the GPU.