Ok, I’m not normally one to be the pedantic bits/bytes guy, but if you’re gonna go and make a bit/byte “clarification” you need to get the annotation correct or you'll just confuse everyone.
It’s 500 kb (lowercase b for bits) and 62.5 kB (uppercase B for bytes).
People always use bits for connectivity. 62.5 kB/sec -- maybe really 55-60 kB/sec downloaded after overhead. Or about 18 seconds to get a megabyte.
This is simultaneously fast (on my 14400 bps modem, the one I spent the most time waiting on downloads with, I was used to 12-13 minutes per megabyte vs. 18 seconds here) and slow (the Google homepage is >1MB, so until you have resources cached you're waiting tens of seconds).
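A quick sketch of the arithmetic above (the 0.9 efficiency factor is my own assumption for protocol overhead, chosen to roughly match the 55-60 kB/sec figure):

```python
def seconds_to_download(size_bytes, link_bits_per_sec, efficiency=1.0):
    """Time to transfer size_bytes over a link rated in bits/sec.
    efficiency accounts for protocol overhead (1.0 = raw line rate)."""
    bytes_per_sec = link_bits_per_sec / 8 * efficiency
    return size_bytes / bytes_per_sec

# 500 kbit/s is 62.5 kB/s: a megabyte takes 16 s at line rate,
# ~18 s at the observed 55-60 kB/s after overhead.
print(seconds_to_download(1_000_000, 500_000))                  # 16.0
print(round(seconds_to_download(1_000_000, 500_000, 0.9), 1))   # 17.8

# The 14400 bps modem: a megabyte at line rate, in minutes.
print(round(seconds_to_download(1_000_000, 14_400) / 60, 1))    # 9.3
```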
It would be nice if everything were just a touch more efficient.
I end up transferring 940kB (with a lot of blocking cranked up). Typing "hello" in the search bar takes it up to 1MB. Then the first page of search results is another 1.3MB.
Now, I assume all of this would start working before it's all transferred. But we're still talking about tens of seconds of transfer at 500kbit/sec.
(And Google at least acts like they care about bandwidth a little. So many 15megabyte pages out there...)
Unfortunately, the 56 kbps-era internet was a lot more usable. I've been on 256 kbps cellular connections (T-Mobile free international roaming) and it works, but it's pretty bad. Everything takes way more data these days, and nobody thinks about slow connections when writing software, so there are a ton of overly aggressive timeouts and bad UIs that assume operations won't take more than a few seconds.
Data rates are almost always multiples of powers of 10, because they're based on symbol/clock rates, which tend to be related to powers of 10. There are no address lines, etc., to push us toward powers of 2 (though we may pick up a few powers of 2 from having a power-of-2 number of possible symbols).
Hence telco rates that are multiples of 56,000 or 64,000; baud rates that are multiples of 300; Ethernet rates that are mostly just powers of 10; etc. etc. etc.
Of course, there's occasional weird stuff, but usually things have a lot of factors of 5 in there and seem more "decimal-ish" than "binary-ish".
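To illustrate the "decimal-ish" point, here's a small sketch that prime-factorizes a few of the rates mentioned above (the rates are just examples; note the recurring factors of 5):

```python
def factorize(n):
    """Return the prime factorization of n as {prime: exponent}."""
    factors = {}
    d = 2
    while d * d <= n:
        while n % d == 0:
            factors[d] = factors.get(d, 0) + 1
            n //= d
        d += 1
    if n > 1:
        factors[n] = factors.get(n, 0) + 1
    return factors

# Telco, serial, and Ethernet rates: lots of 5s, not pure powers of 2.
print(factorize(56_000))       # {2: 6, 5: 3, 7: 1}
print(factorize(64_000))       # {2: 9, 5: 3}
print(factorize(9_600))        # {2: 7, 3: 1, 5: 2}
print(factorize(10_000_000))   # {2: 7, 5: 7}  -- 10 Mbit Ethernet
```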
> Figma runs untrusted user plugins in your browser by running them in a QuickJS engine that is compiled to Wasm.
According to the linked blog article, this is not what they are doing, but rather an option they explored. They use a JavaScript Realms shim to isolate the execution.
Curious, what did you get out of it? Counseling? Some action plan? A reflection? Seems intriguing to do, but would like to know how it helped you exactly if you don’t mind sharing.
Career planning at the moment, and tailoring resumes. It's not tailoring them well enough yet because it's hallucinating too much, so I need to write a specific prompt for that. But I know from work, where I do similar things (text generation with a human in the loop), that I can tackle that problem.
So yes, I definitely add to the "AI generated" text pile, but I read over all the texts, and usually they don't get sent out. Ultimately, it's still a lot quicker to do it this way.
For career planning, so far it hasn't beaten my own insights, but it came close. For example, it suggested I should actually be a developer advocate instead of a software engineer. Two or three years ago I had that same thought. I ultimately rejected the idea because of who I am, but it's a good one to think about.
As I see it now, the best job for me would be a tech consultant. Or, as I'd also call it: a data analyst who spots problems and then uses his software engineering or teaching skills to solve them. I don't think that job has a good catch-all title, as it's a pretty generalist role. I'm currently at a company that allows me to do this, but the pay is quite low, so I'm looking for a tech company where I could do something similar. Maybe a product manager role? It really depends on the company culture.
What I also noticed it did better: it doesn't reduce me to data engineering anymore. It understands that I aspire to learn everything and anything I can get my hands on. It's my mode of living and Claude understands that.
So nothing too spectacular yet, but it'll come. It requires more prompt/context engineering and fine-tuning of certain things, which I haven't gotten around to yet.
> What I also noticed it did better: it doesn't reduce me to data engineering anymore. It understands that I aspire to learn everything and anything I can get my hands on. It's my mode of living and Claude understands that.
I'm really glad you are getting some personal growth out of these tools, but I hesitate to give Claude as much credit as you do. And I'm really cautious about saying Claude "understands," because that word has many meanings and it isn't clear which ones apply here.
What I'm hearing is that you use it like a kind of rubber-duck debugger. Except this is a special rubber duck because it can replay/rephrase what you said.
That egg was totally still edible. Even if you pierce it for cooking, it should be good for at least a few days. If you do it right, it can be weeks.
In Germany you can buy cooked eggs in the supermarket, and they are not refrigerated.
This article was weird, in that he went through the whole thing about how effective the layers are without mentioning (until the end) that there was a hole through all of them except the egg white.
That’s true; at least, I would have eaten it. In the past I have left boiled eggs with holes sitting out for at least a day after cooking them. It depends, of course: I don’t live in a super humid and hot climate, and results may differ there.
The egg cooker is a mystery to me. What is the hole for? I hard-boil eggs by heating a pot of water containing the eggs. Why would you puncture the shell?
I do not know about TFA and the egg cooker, but I usually boil the water first so I can pull the eggs out at a specific time (6:30 to 7:00 minutes) so the yolk is gooey. When I put an egg into boiling water without putting a hole through to the air sac, I think there's a higher chance that the egg will crack and spew albumen throughout the boiling water.
The article addresses this: in countries that wash the protective layer off the shell (like the US), eggs have to be refrigerated to minimize salmonella contamination. The EU deals with this by vaccinating hens against salmonella instead.
Does the same apply to cooked eggs? I would have thought all bacteria on the shell are gone after the process, making EU and US cooked eggs virtually the same. A short search seems to agree with that.
It is very, very unlikely that a fully cooked egg that sat out overnight would have enough bacterial growth (and toxin production by bacteria) to represent a threat to a normal person with a typical immune system.
If the egg already had bacteria growing in it (which is also unlikely), then it's possible that enough bacterial toxins could accumulate inside by the time you eat it.
There are probably a number of extremely rare scenarios that would change the situation, but this article spends a lot of words and facts only to arrive at an unlikely conclusion.
Had a similar thought recently: with the advent of AI, custom software has become extremely attainable. DHI syndrome suddenly becomes less of an issue -- and can actually become a perk, as you build the most minimal software that works for your org, skipping potential vendor/SaaS fees. Really curious how the software landscape will change in the next few years because of that.
Can recommend visx as well. It's great because it's basically a thin layer around d3, giving you maximum flexibility while providing lots of built-in components to get you started.
Maybe she is in a holdback experiment. To understand how a feature affects the metrics (such as ad performance), they often keep some people in a holdback group. I worked there, and we did have such experiments for our features.