I don’t get your argument at all. People want characters from their strings, and Java decided on UTF-16 because at the time it seemed like the “right” way to do Unicode. What would you suggest they have adopted back then? Similarly, C’s char type is named “char” because people dealt with ASCII back then and a character was a byte. It turns out that sucks, but being able to do byte arithmetic is useful, so char is still around for that purpose (and C++ has actually added std::byte for exactly this; perhaps C will get it as well at some point). As for Rust, this is just a case of holding it wrong: the operation is generally not relevant, so why even expose it? It doesn’t make sense to allow random indexing if you’re just going to crash on misalignment. It would be better to have an API that doesn’t allow misalignment at all: see Swift’s implementation, for example.
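To make the Rust point concrete (a minimal sketch; the string and indices are my own): Rust strings are UTF-8 bytes, byte-range slicing panics when the range splits a code point, and the checked alternative returns None instead.

```rust
fn main() {
    let s = "héllo"; // 'é' occupies bytes 1 and 2 in UTF-8

    // Checked slicing: None when the range lands mid-code-point.
    assert!(s.get(0..1).is_some());  // "h" - valid boundary
    assert!(s.get(0..2).is_none());  // splits 'é'
    assert_eq!(s.get(0..3), Some("hé"));

    // The unchecked form, &s[0..2], would panic here with
    // "byte index 2 is not a char boundary".
}
```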
People should stop wanting "characters from their strings", especially in the sort of high-level software you'd attempt in Java - and Java was in a good position to encourage that the way we've successfully done it for similar things: by not providing the misleading API shape. Reserve char but don't implement it, is what I'm saying - like goto.
Compare decryption, for example, where we learned not to provide decrypt(someBytes) and checkIntegrity(someBytes) even though that's what people often want - it's a bad idea. Instead we provide decrypt(wholeBlock), and you can't call it until you've got a whole block we can do integrity checks on; it fails without releasing bogus plaintext if the block was tampered with. An entire class of stupid bugs becomes impossible.
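The API shape can be sketched like this - a toy, NOT real cryptography (XOR "encryption" plus a checksum tag, with names like seal/open invented for illustration); the point is only that the single whole-block operation never hands back plaintext from a tampered block.

```rust
#[derive(Debug, PartialEq)]
struct AuthError;

// Toy integrity tag: a wrapping checksum keyed by `key`. Illustrative only.
fn tag(key: u8, data: &[u8]) -> u8 {
    data.iter().fold(key, |acc, b| acc.wrapping_add(*b))
}

// Encrypt and append the tag, producing one whole block.
fn seal(key: u8, plaintext: &[u8]) -> Vec<u8> {
    let mut block: Vec<u8> = plaintext.iter().map(|b| b ^ key).collect();
    block.push(tag(key, &block));
    block
}

// The ONLY decrypt operation: takes the whole block, verifies first,
// and refuses to release any plaintext if verification fails.
fn open(key: u8, whole_block: &[u8]) -> Result<Vec<u8>, AuthError> {
    let (t, body) = whole_block.split_last().ok_or(AuthError)?;
    if tag(key, body) != *t {
        return Err(AuthError); // tampered: no bogus plaintext escapes
    }
    Ok(body.iter().map(|b| b ^ key).collect())
}

fn main() {
    let block = seal(7, b"hello");
    assert_eq!(open(7, &block), Ok(b"hello".to_vec()));

    let mut tampered = block.clone();
    tampered[0] ^= 1; // flip one bit anywhere in the block
    assert_eq!(open(7, &tampered), Err(AuthError));
}
```

Because there is no separate decrypt-without-verify entry point, the "forgot to check integrity" bug has no way to be written.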
Java should have provided APIs that work on Strings, and said: if you think you care about the things Strings are made up of, either you need a suitable third-party API (e.g. text rendering, spell checking) or you want bytes, because that's how Strings are encoded for transmission over the network or storage on disk. You don't want to treat the string as a series of "characters", because they aren't.
The idea that a String is just a vector of characters is wrong; that's not what it is at all. A very low-level language like C, C++ or Rust can be excused for exposing something like that, because it's necessary for the low-level machinery, but almost nobody should be programming at that layer.
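A quick way to see why "vector of characters" breaks down (Rust here, but the same holds in any language): one user-perceived character can be several Unicode scalar values and even more bytes, so no single element type answers "what is this string a vector of?".

```rust
fn main() {
    // "é" written as 'e' plus U+0301 COMBINING ACUTE ACCENT:
    // one user-perceived character, two Unicode scalar values, three UTF-8 bytes.
    let s = "e\u{0301}";
    assert_eq!(s.chars().count(), 2); // scalar values - not "characters"
    assert_eq!(s.len(), 3);           // bytes
    // Counting user-perceived characters (grapheme clusters) requires a
    // Unicode segmentation library; it isn't a property of any vector at all.
}
```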
Imagine if Java insisted on acting as though your Java references were numbers and it could make sense to add them together. Sure, in fact they are pointers, and a pointer is an integral type, so you could mechanically add them together - but that's nonsense; you would never write code that needs to do this in Java.
K&R C claimed that char isn't just for representing "ASCII" (which wasn't, at that time, set in stone as the encoding you'd be using) but for representing the characters of whatever system you're programming, regardless of whether they're ASCII. 'A' wasn't defined as 65 but as whatever the code happens to be for A on your computer. Presumably the current ISO C doesn't make the same foolish claim.