The difference is that you can statically link GTK+, and it'll work. You can't statically link glibc, if you want to be able to resolve hostnames or users, because of NSS modules.
Not inherently, but static linking to glibc will not get you there without substantial additional effort, and static linking to a non-glibc C library will by default get you an absence of NSS.
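For a concrete feel for what breaks, here's a minimal sketch (the hostname is just a placeholder): getaddrinfo goes through NSS, so even a binary built with `gcc -static` still needs the target system's shared libc/NSS modules at run time, and glibc's linker warns about exactly that.

    /* nss_demo.c -- resolve a hostname through NSS-backed getaddrinfo.
       Build normally: gcc nss_demo.c -o nss_demo
       Build with -static and glibc warns that getaddrinfo still needs
       the shared libraries of the glibc version used at run time. */
    #include <stdio.h>
    #include <stdlib.h>
    #include <netdb.h>

    int main(void) {
        struct addrinfo *res = NULL;
        int rc = getaddrinfo("example.com", NULL, NULL, &res);
        if (rc != 0) {
            fprintf(stderr, "getaddrinfo: %s\n", gai_strerror(rc));
            return EXIT_FAILURE;
        }
        puts("resolved");
        freeaddrinfo(res);
        return EXIT_SUCCESS;
    }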
Multiple versions of GTK or Qt can coexist on the same system. GTK2 is still packaged on most distros; GIMP, for example, only switched to GTK3 a year or so ago, I think.
The GTK update schedule is very slow, and you can run multiple major versions of GTK on the same computer, so that's not the right argument. When people say GTK's backwards compatibility is bad, they are referring in particular to its breaking changes between minor versions. It was common for themes and apps to break (or behave differently) between minor versions of GTK+ 3, as deprecations were sometimes accompanied by the breaking of the deprecated code. (Anyway, before Wayland support became important, people stuck to GTK+ 2, which was simple, stable, and still supported at the time; and everyone had it installed on their computer alongside GTK+ 3.)
Breaking between major versions is annoying (2 to 3, 3 to 4), but for the most part it's renaming work and some slight API modifications, reminiscent of the Python 2 to 3 switch, and it has only happened twice since 2000.
We definitely can, because almost every other POSIX libc doesn’t have symbol versioning (or MSVC-style multi-version support). It’s not like the behavior of “open” changes so radically all the time that you need to know exactly which versioned symbol it linked against. It’s really just an artifact of decisions from decades ago, and the cure is way worse than the disease.
The problem is not the APIs, it's symbol versions. You will routinely get loader errors when running software compiled against a newer glibc than what a system provides, even if the caller does not use any "new" APIs.
glibc-based toolchains are ultimately missing a GLIBC_MIN_DEPLOYMENT_TARGET definition that gets passed to the linker so it knows which minimum version of glibc your software supports, similar to how Apple's toolchain lets you target older MacOS from a newer toolchain.
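In the meantime, the closest workaround I know of is building against an old sysroot/container, or pinning individual symbols to older versions with .symver directives. A rough sketch (the GLIBC_2.2.5 tag is an x86_64-specific assumption; check `objdump -T /lib/x86_64-linux-gnu/libc.so.6 | grep memcpy` for what your oldest target ships):

    /* pin_memcpy.c -- bind the memcpy reference to the old GLIBC_2.2.5
       version instead of GLIBC_2.14, so the binary doesn't request a
       newer glibc than it actually needs. Note the compiler may inline
       small memcpy calls, in which case no symbol reference is emitted. */
    __asm__(".symver memcpy, memcpy@GLIBC_2.2.5");

    #include <stdio.h>
    #include <string.h>

    int main(void) {
        char dst[16];
        memcpy(dst, "hello", 6);
        puts(dst);
        return 0;
    }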
Yes, so that's why freezing the glibc symbol versions would help. If everybody uses the same version, you cannot get conflicts (at least after it has rippled through and everybody is on the same version). The downside is that we can't add anything new to glibc, but I'd say given all the trouble it produces, that's worth accepting. We can still add bugfixes and security fixes to glibc, we just don't change the APIs of the symbols.
It should not be necessary to freeze it. glibc is already extremely backwards compatible. The problem is people distributing programs that request the newest version even though they do not really require it, which then fails on systems with an older version. At least this is my understanding.
The actual practical problem is not glibc but the constant GUI / desktop API changes.
In principle you can patch your binary to accept the old local version, though I don't remember ever getting it to work right. Anyway, for the brave or foolhardy, here's the gist:
2. Replace libc.so with a fake library that provides the right version symbol via a version script
e.g. version.map
GLIBC_2.29 {
  global:
    *;
};
With an empty fake_libc.c
`gcc -shared -fPIC -Wl,--version-script=version.map,-soname,libc.so.6 -o libc.so.6 fake_libc.c`
3. Hope that you can still point the symbols back to the real libc (either by writing a giant pile of dlsym C code, or some other way, I'm unclear on this part)
Ideally glibc would stop checking the version if it's not actually marked as needed by any symbol, not sure why it doesn't (technically it's the same thing normally, so performance?).
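To see which version tags a binary actually requests, and which symbols carry them (`./mybinary` is a placeholder):
`readelf -V ./mybinary | grep GLIBC_` (the version requirements in .gnu.version_r)
`objdump -T ./mybinary | grep GLIBC_` (the individual versioned symbol references)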
This isn't a problem in other languages because most other languages don't have strong, statically typed errors that need to compose across libraries. And those that do have the same problem.
The general argument against adding something to `std` is that once the API is stabilized, it's stabilized forever (or at least for an edition, but practically I don't think many APIs have been changed or broken across editions in std).
The aversion to dependencies is just something you have to get over in Rust imo. std is purposefully kept small and that's a good thing (although it's still bigger and better than C++, which is the chief language to compare against).
I don't know; I've known many people who struggle with exams even when they know the material, and even more people who excel at exams while learning nothing. Falling back on any kind of exam is just a recipe for more rote learning, and that doesn't create better people (although possibly better readers, which we need).
(Preface: I am not a teacher, and I understand this is a hot take). At the end of the day there's an unwillingness from every level of education (parents, teachers, administrators, school boards, etc) to fight against the assault on intelligence by tech.
I don't think kids should have access to the public internet until they're adults, and they certainly should never have it in schools except in controlled environments. Schools could create private networks of curated sites and software. Parents don't have to give their kids unfettered access to computers. It's entirely in the realm of possibility to use computers and information networks in schools, accessed by children, designed to make it impossible to cheat while maximizing their ability to learn in a safe environment.
We don't build it because we don't want to. Parents don't care enough, teachers are overworked, administrators are inept, and big tech wants to turn kids into little consumers who lack critical thinking and are addicted to their software.
I see this line of argument more and more over the last decade and it makes me feel heartless for my opinion.
But if you know the material yet cannot apply it in an examination, then you either don't actually know the material or don't have the emotional (for lack of better term) control to apply it in critical situations. Both are valid reasons to be marked down.
> don't have the emotional (for lack of better term) control to apply it in critical situations
No, not really, it just means you couldn't apply it in this one particular anxiety-inducing situation.
If someone finds it easier to display their knowledge in a certain way, then schools should strive to accommodate that as best they can (obviously there are practical limitations to this).
Mental health should be left to mental health professionals, because you won't achieve anything by punishing students for their mental health struggles; you just make them hate you, hate school, and become even more anxious.
I would argue that "knowledge" is an almost meaningless concept on its own. What assessments measure is a more complex form of "competency", and the competency of being able to write an essay on a topic is different from the competency of passing an MCQ quiz about it and both are different from being able to apply it in the field.
I don't have a clear solution, other than to have the assessments depend on what we're preparing people for. As an extreme example, I don't care how good of an essay a surgeon or anesthesiologist can write if they can't apply that under pressure.
But on the topic of test anxiety: I think intentionally causing emotional distress to children for the purposes of making a bad evaluation of their studies is cruel. It's a kind of cycle of trauma - "I did this, so you must too." We use grades to make value judgements of the quality of our children, when what we should be measuring is the ability of our schools to educate them and not how well-educated _the kids are_. The system is backwards, basically, and the fact it causes distress as a side effect is something that _should_ be managed - not ignored.
However, anxiety exists, and not teaching children to manage it is also bad. One of the really good things I've seen locally is that my school district (the same one I went through as a child) focuses on emotional education at the grade school level much more than when I was a kid, and I notice that the kids have much better emotional regulation than my generation.
Children should and must be allowed to fail. In fact, failure is the default outcome most of the time.
I wish I had learned in childhood that doing my best was enough. Not being the best, just doing my best.
But no, this is a lesson I learned from sim racing, as an adult, during the COVID-19 quarantine, as there was not much else to do.
What I learned from sim racing:
— If I make a mistake, and I keep thinking about that mistake, I will just make more mistakes. Mental recovery, and not punishing myself, is a must. I must go back to mental clarity as fast as possible, to avoid making another mistake.
— Sometimes, doing my best is not enough. It can even be worthless. Other people make mistakes, and that can ruin my race. In a long season, this can be offset by consistently good results. “It is possible to commit no mistakes and still lose. That is not a weakness; that is life.” — Jean-Luc Picard
— I should not respect a driver just because he has a famous last name. But I must respect that he did 600 laps preparing for the race, and the way to show that respect is to practice as much myself. Preparation is important; we can't just show up at a new track and expect to win. The winner is usually the best combination of general experience and event preparation.
— Nothing feels better than a victory that's hard-earned, against a talented group. Easy victories just feel cheap in comparison.
> I've known many people that struggle with exams even if they know the material and even more people that excel with exams that learn nothing.
This point is overstated. The former did not know the material as well as they thought, and frankly, unless the exam was very badly designed, the latter don't exist.
There are some people who fail in stressful situations, but not that many of them. If you have met many people like that, you were most likely in a culture where people did not learn well and then blamed their inability to test.
But even more importantly, people who pass test after test without learning anything are not a thing. There are some badly designed tests here and there, occasionally. But in most cases, even if the test is not measuring the correct thing, you won't pass it without learning and knowing things.
Absolutely not. Actually having to construct the flashcards embeds the information in your head at a deeper level than ten reviews could.
Same with taking notes in class. You can never look at them again, but most of the benefit comes from having to organize the information in the first place.
I think it depends on the student, but you are probably correct overall. As someone who hated reading most of my textbooks, there is absolutely no way I am going to extract relevant flashcard material out of them better than an LLM can. I'm going to get bored, and my mind will probably wander and start thinking about other things while I am "reading".
I assure you that if you have that problem, going through flashcards will be even worse. Flashcards are the most mind-numbingly boring way to learn.
The goal is not "to produce flashcards". The goal is to know the content. And learning from randomly selected factoids without an overall structure is just a dumb way to learn.
Both can be bad. What's hard, though, is convincing the people who work on these things that they're actively harming society (in other words, most people working on ads and AI are not good people; they're the bad guys but don't realize it).
No, Apple does not buy production capacity to prevent others from using it. They buy it to use it themselves.
The wafers are not DRAM. This is more like burning oil wells so your enemy can't use them. Wafers are to chips what steel blanks are to engines. You basically need clean rooms just to accept delivery, and entire fabs to do anything. Someone who doesn't own a fab buying the wafers is essentially buying them to destroy them.
If the industry has a bit of fear that demand will slow down by the time they can output a meaningful number of chips, then probably not. Time will tell.