Rust's generics are entirely type-based, not syntax-based: generic code must declare up front all the traits (concepts) it needs. The type system has restrictions that prevent violating the ODR. It's very reliable, but some use-cases that would be basic in C++ (e.g. numeric code) can be tedious to express.
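A minimal sketch of what "declare all the traits you need" looks like in practice (the bounds here are my choice for illustration; a real numeric abstraction would likely use a crate like `num-traits`):

```rust
use std::ops::Add;

// Unlike a C++ template, this function can only use the capabilities it
// declares: Add for `+`, Copy to move values out of the slice, and Default
// as a stand-in for "zero". Forget a bound and it won't compile.
fn sum<T: Add<Output = T> + Copy + Default>(items: &[T]) -> T {
    items.iter().fold(T::default(), |acc, &x| acc + x)
}

fn main() {
    assert_eq!(sum(&[1, 2, 3]), 6);
    assert_eq!(sum(&[1.5f64, 2.5]), 4.0);
}
```

The upside is that the function is type-checked once, against its bounds, instead of per-instantiation as in C++.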
Generic code is stored in libraries as MIR, which is halfway between an AST and LLVM IR. It still has to be monomorphized and is slow to optimize, but at least it doesn't pay the reparsing cost.
It’s impossible (?) due to the “coherence” rule. A type A can implement a trait B in only two places: the crate where A is defined or the crate where B is defined. So if you can see A and B, you know definitively whether A implements B.
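A quick sketch of the two sides of the coherence ("orphan") rule, assuming `Vec<u8>` stands in for a foreign type:

```rust
// Local trait + foreign type: allowed, because this crate defines the trait.
trait Describe {
    fn describe(&self) -> String;
}

impl Describe for Vec<u8> {
    fn describe(&self) -> String {
        format!("{} bytes", self.len())
    }
}

// Foreign trait + foreign type: rejected by the compiler (error E0117), e.g.
//   impl std::fmt::Display for Vec<u8> { ... }
// Neither Display nor Vec is defined here, so no crate can sneak in a
// conflicting impl behind your back.

fn main() {
    assert_eq!(vec![1u8, 2, 3].describe(), "3 bytes");
}
```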
The C++ WG keeps looking down at C and the old C++ sins, sees their unsafety, and still thinks that's the problem to fix.
Rust looks the same way at modern C++. The std collections and smart pointers already existed before the Rust project was started. Modern C++ is the safety failure that motivated the creation of Rust.
The whole Epic vs Apple case was about Apple blocking this. Before being slapped by regulators, Apple had anti-steering policies forbidding iOS apps from even mentioning that purchasing elsewhere is possible.
Even after the EU DMA told them to allow purchases via the Web, Apple literally demanded a 27% cut from purchases happening outside of the App Store (and then a bunch of other arrogantly greedy fee structures that keep them in court).
Apple knows how hard it is to exist outside the duopoly of app stores. They keep web apps half-assed and won't direct users to them, but allow knock-off apps to use your trademarks in their search keywords.
Probably yes. It's ~300KB per binary, and it's a one-time cost.
It can be avoided entirely by disabling the standard library, but that's inconvenient, and usually done only when writing for embedded devices.
Usually the problem isn't the size directly, but duplication of Rust dependencies in mixed C++/Rust codebases.
If you end up with a sandwich of build systems (library dependencies like C++ => Rust => C++ => Rust), each Rust/Cargo build bundles its own copy of libstd and crates. Then you either need to ensure that the linker can clean that up, or use something like Bazel instead of Cargo so that both the Rust and C++ deps are seen as part of a single dependency tree.
The size is not fixed. It changes based on how much of the standard library you use. Dynamically linking the standard library is also a valid option in many cases.
Posted elsewhere, but: the default hello world, stripped, with one codegen unit and panic=abort, was 342kB on both nightly and stable. Adding LTO dropped it by 42kB on stable and 40kB on nightly. Adding build-std and building only core did not reduce the size any further.
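For reference, the knobs mentioned above map onto Cargo profile settings roughly like this (a sketch; build-std is separate and needs nightly plus `-Z build-std` on the command line):

```toml
# Size-oriented release profile (all real Cargo options)
[profile.release]
codegen-units = 1   # one codegen unit
panic = "abort"     # drop unwinding machinery
lto = true          # link-time optimization
strip = true        # strip symbols at link time
opt-level = "z"     # optimize for size
```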
I agree, but if you use more of the std library it will contribute more to the final image. I can write a 100-line Rust file that ends up being 1MiB (even after LTO) because I pull in as much code from the standard library as possible. This is not a knock on Rust, but your statements can be misleading as well. In practice most folks ignore the majority of the standard library, so only a few hundred KiB of it end up in their binary.
Bit late, but I made a small program that did network and file IO, used a variety of containers, and ran system commands. I couldn't get the default release build over 650kB. Using a single codegen unit, LTO, strip, and panic=abort got that down to 432kB. Using build-std didn't get it any smaller. Optimizing for size was the only way I got build-std to shrink things further than the other options alone, and that only gained me 10kB. My conclusion is that build-std is not a substantial contributor. Using std seems to add 300kB-500kB depending on how much you use. That seems like a lot to me because I am old, but ELF binaries add several kB of header alone, so maybe I should stop worrying so much.
If you build the standard library as a shared library it will be 4+MiB. The portion of it that you end up using varies, but there are ways to use a lot of it without a great deal of code. I can get a 1.5MiB binary down to 500KiB by dynamically linking the standard library. It's a net win because I have many such binaries, so it saves size in aggregate. It really does come down to what subset you use, though.
I mean, you pay one upfront cost for things like allocators, common string manipulation, std::fmt, the std::{fs, io, path} helper functions, and gathering of pretty backtraces for panics (which is a surprisingly fiddly task, involving ELF+DWARF parsers and gzip to decompress the debug info).
A println!("hello world") happens to pull in almost all of it (it panics if stdout is closed).
Code growth after that is simply proportional to what you're doing; you're not getting a whole new copy of std::fmt every time you call print.
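The "panics if stdout is closed" behavior can be seen by writing out a rough hand-written equivalent of `println!` (not the actual macro expansion, just a sketch of what it amounts to):

```rust
use std::io::Write;

fn main() {
    let stdout = std::io::stdout();
    let mut lock = stdout.lock();
    // println! effectively does this unwrap for you, which is why one call
    // drags in the formatting machinery AND the panic/backtrace machinery.
    writeln!(lock, "hello world").expect("failed printing to stdout");
}
```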
Has anyone here even read the article?! All the comments here assume they're building a package manager for C!
They're writing a tool to discover and index all indirect dependencies across languages, including C libraries that were smuggled inside other packages and weren't properly declared as a dependency anywhere.
"Please don't" what? Please don't discover the duplicate and potentially vulnerable C libraries that are out of sight of the system package manager?
Yeah it's pretty weird how people assume that -l<name> is supposed to work in gcc/clang across distributions, but somehow deriving which OS package gives you that lib<name>.so file is the devil.
Once a week you plug it in for ~30 minutes somewhere.
EVs charge unattended, so they can be left charging while you do something else. Shopping malls often have chargers.
At city distances and city speeds, BEVs often have enough battery to last a week or two, and the battery doesn't drain while the car isn't being used.
You don't have to charge to full if you don't have time. Even if you plug in for 10 minutes, you'll probably return home with more charge than when you left.
I can't think of the last time I've willingly gone to a store to shop. I do all my shopping online and everything is delivery including groceries. Going to a physical store feels like a huge waste of time to me.
I drive quite a bit for work, going around to calibrate and repair lab equipment, so it seems like a major inconvenience to my schedule to have to go somewhere to charge for a while so often, and to hope the chargers are working and not in use.
If you can't charge at home, can't charge at work, don't park at any other place, and still need to drive a lot, then this is indeed a tough edge case.
I don't have a charger at home, but when I'm travelling I stop at pubs, cafes, and fast food joints, so I have plenty of opportunities to charge.
Some cities have chargers in lamp posts for overnight charging of cars parked on the street. This should be more common! 300kW DC chargers that recharge a car in 20 minutes are expensive equipment, but a 3kW AC charger is technically almost as simple as the cable for an electric oven.
"It's enterprise" it's such a lame excuse, IMO. There shouldn't be a speed tax, especially if you aren't using the "enterprise" features. We need to stop making excuses for bad software. It's 2026 we have insanely powerful computers, why should I have to wait to get the result of searching a few text-only issues or display a diff?
In the UK, just having batteries already helps. There's a surplus of wind power at night; shifting it to the 5pm peak pays back the cost of the battery quicker than solar panels pay for themselves.