If you go back further than that, teams used to destroy entire engines for a single qualifying session.
The turbocharged BMW M12/M13 used in the mid-eighties put out about 1,400 horsepower at 60 PSI of boost, and possibly even more than that, since no dyno at the time was capable of measuring it.
They would literally weld the wastegate shut for qualifying, and it would last for about 2-3 laps: outlap, possibly warmup lap, qualifying time lap, inlap.
After which the engine was basically unusable, and so they'd put in a new one for the race.
A current example would be drag racing cars, whose motors are designed and run in such a way that they only survive for about 800 total revolutions.
After yet another pair of kitchen scissors had its plastic handle fall off, revealing the tiny metal tang that creates the inevitable weak spot, I went looking for an all-metal pair. Not at all easy to source, as it turns out: there are only a small handful of even vaguely reasonably priced models available.
They're just scissors; how are we still, as a species, finding ways to make them shitter and more breakable? They're a solved problem, and yet some guy managed to hit on a way to make them out of melting plastic?
I don't need them to be laser etched or artisanal or blessed by levitating monks or whatever. Just two bits of metal with a rivet that will still be scissor-shaped when I'm dead and buried. It's not even that much more metal than plastic handled scissors, and the hard bit is in the edges anyway, not the handles. It's probably never been cheaper or easier to make not-shit things and yet there's just so much shit everywhere.
Slightly an aside, but you might try searching for dressmakers' shears instead. Lots of those seem to be all metal. (Not sure what you consider a reasonable price, however.)
Funnily enough I do have a pair of exactly those from my grandmother (still very functional maybe 80 years on)! Drop-forged and everything. They were probably a week's salary or something!
> how are we still as a species finding ways to make them shitter and more breakable?
Products when they are first introduced tend to be overengineered, since the way they will be used is not entirely known. As this knowledge accumulates the products can be optimized to be just good enough (just strong enough, just durable enough). You should expect in equilibrium that products will be optimized for minimum cost at the minimum tolerable level of quality.
There's a booth in a consignment shop in town that has a load of old high carbon steel Case scissors of different sizes. Resale shops may be the way to go.
Labor has never been more expensive than it is now. Raw resources have never been under more competition. We as a species are more numerous and consume more scissors than ever before. And supply chains have never separated the end user farther from the producer.
Scissors have never required less labour to make and steel is maybe not quite as cheap as 5 years ago but still mind meltingly cheap in historical terms. The metal in the handles of an all-metal pair of scissors is maybe a dollar. Bulk steel is 50 cents a pound.
Scissors also don't have to be "consumed", or at least not substantially over the course of a human life. I'd expect them to be durable goods. It's more than possible to make a pair of scissors, for not even that much more than a shit pair, that outlasts its first owner. We just generally choose not to.
It's a lot cheaper to cut the blades out of flat sheet metal and affix injection-molded plastic handles to give a nice 3D sculpted ergonomic grip than it is to cast/forge/machine the ergonomic handle out of steel. For 90 percent of users plastic will be just fine, and the plastic will last for decades without issue.
I don't buy the "but good scissors cost too much to make today" argument.
> Labor has never been more expensive than now.
Each hour of wages is amortized across dozens, hundreds, possibly thousands of scissors. You could double the wages of everyone involved, from manufacturing to fulfillment to logistics to sales, and it'd translate to maybe a few pennies of added cost per pair.
> Raw resources have never been under more competition than now.
As the other commenter pointed out, steel's still plenty cheap and abundant.
> We as a species are more numerous and consume more scissors than ever before.
That's more than offset by our increased ability to make scissors at scale.
> Supply chains have never farther separated end user from producer, as now.
The whole point of that separation is to drive down costs.
----
All in all, the available information points rather squarely to manufacturers being greedy rather than costs somehow forcing their hand.
The one review I found with the word "melting," or anything conceptually close to this, was "My Favorite Scissors have Died!" The person is not describing melting in the heat or deformation sense, which I think a lot of people would have assumed. Rather, they're exuding a liquid, like a lot of old rubber materials. To be generous, I'll wonder if maybe it's flammable.
> Don't use the term "hunt" in the name, to me it just screams "dupe" instead of something unique.
> [...]
> You can have "have you launched yet?", "find the next best thing", "what's trending today"...etc
OP: I may or may not be in possession of hasitlaunched dot com, should you be interested in purchasing it…
> The only time exponential backoff is useful is if the failure is due to a rate limit and you specifically need a mechanism to reduce the rate at which you are attempting to use it.
That's exactly what exponential backoff is for. In practice, the added latency introduced by the backoff should be maintained for some time even after a successful request, with the interval gradually reduced thereafter.
> I much prefer setting a fixed time period to wait between retries (would you call that linear backoff? no backoff?)
I've heard it referred to as truncated exponential backoff.
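For reference, the capped variant usually called truncated exponential backoff can be sketched in a few lines of Python (the base delay, cap, and retry count here are illustrative values, not from this thread):

```python
import random


def backoff_delays(base=0.5, cap=30.0, max_retries=8):
    """Yield truncated exponential backoff delays with full jitter.

    The delay ceiling doubles each attempt but is truncated at `cap`;
    the random jitter spreads clients out so they don't all retry in
    lockstep after a shared outage.
    """
    for attempt in range(max_retries):
        ceiling = min(cap, base * (2 ** attempt))
        yield random.uniform(0, ceiling)
```

The "truncation" is the `cap`: delays grow exponentially until they hit it, then stay flat, which is different from a fixed wait between retries.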
This place has both changed a _lot_ and also very little, depending on which axis you want to analyze. One thing that has been pretty consistent, however, is the rather minimal amount of trolls/bots. There are some surges from time to time, but they really don't last that long.
I mostly agree. Still, being a minority from Montreal who left and found life in Japan significantly less alienating, I hesitate to agree wholeheartedly.
> Er, not really. Not meaningfully. They can be freely compared with timezone-aware timestamps
Author here. My original post was a little ambiguous on this topic; I've updated it to make it clearer.
The tl;dr is that in Python, `time.time()` calls the C stdlib `time` function (at least in CPython), which follows the POSIX standard. It turns out that the POSIX standard does _not_ mention timezones at all: https://pubs.opengroup.org/onlinepubs/9699919799/functions/t...
To wit, you can't actually assume that timestamps are UTC in Python, which is a different kind of insanity:
> To wit, you can't actually assume that timestamps are UTC in Python, which is a different kind of insanity:
The code you’ve put in this comment is some… utter nonsense, sorry! Not sure how else to put it. And you’ve come to a wrong conclusion as a result. And the wrong conclusion does not even make sense.
When you call .replace(), you’re constructing the Toronto time which has the same local time as the current UTC time. This MUST have a different timestamp than the current UTC time.
Basically, it is 5:00 PM right now in Toronto, UTC-4, which is 9:00 PM UTC. Your code says: "take the current UTC wall-clock time (9:00 PM), relabel its time zone as Toronto (9:00 PM Toronto), and then give me the timestamp for that." The result is nothing more than a timestamp four hours in the future. You've managed to construct a sequence of API calls that adds four hours to the current time.
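For readers following along, the sequence being described probably looks something like this (the original comment's code isn't quoted here, so this is an assumed reconstruction):

```python
from datetime import datetime, timezone
from zoneinfo import ZoneInfo  # stdlib since Python 3.9

# utcnow() returns a *naive* datetime holding the current UTC wall-clock
# time; it is deprecated in modern Python for exactly this kind of trap.
naive_utc = datetime.utcnow()

# replace() grafts a Toronto tzinfo onto those UTC wall-clock fields,
# i.e. it relabels "9:00 PM UTC" as "9:00 PM in Toronto", which is a
# different instant, four (or five) hours later.
mislabeled = naive_utc.replace(tzinfo=ZoneInfo("America/Toronto"))

# The resulting timestamp sits hours ahead of the actual current time.
drift = mislabeled.timestamp() - datetime.now(timezone.utc).timestamp()
```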
Some problems:
1. You SHOULD know better than to call .utcnow(). When you try it, Python will print out this message:
> DeprecationWarning: datetime.datetime.utcnow() is deprecated and scheduled for removal in a future version. Use timezone-aware objects to represent datetimes in UTC: datetime.datetime.now(datetime.UTC).
2. In general, .replace() CHANGES the datetime object into a different one, which represents a different moment in time. If you want the same moment in time but a different time zone, you want .astimezone(), which gives you the SAME moment in time, but with a different timezone.
3. Timestamps are safe to compare. They represent instants in time and you do not need to worry about timezone conversions when you work with timestamps.
4. The notion that “you can't actually assume that timestamps are UTC in Python” is nonsensical, because timestamps don’t have timezones! A timestamp isn’t UTC in the first place. It simply represents an instant in time.
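The distinction above between .replace() and .astimezone() can be demonstrated directly (a minimal sketch using the standard zoneinfo module):

```python
from datetime import datetime, timezone
from zoneinfo import ZoneInfo  # stdlib since Python 3.9

now = datetime.now(timezone.utc)
toronto = ZoneInfo("America/Toronto")

# .replace() keeps the wall-clock fields and just swaps the tzinfo,
# so the result names a *different* instant in time.
replaced = now.replace(tzinfo=toronto)

# .astimezone() converts to the new zone, preserving the instant.
converted = now.astimezone(toronto)

assert converted.timestamp() == now.timestamp()
assert replaced.timestamp() != now.timestamp()  # off by the UTC offset
```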
I really think there are some fundamental misconceptions at work here. I am hoping that this comment will help bust some of your misconceptions and help you arrive at a more correct understanding of how time works:
- datetime objects can be timezone naïve or timezone aware,
- timestamps are in an entirely separate category, where timezones do not matter at all.
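To illustrate the second point: two aware datetimes in different zones that denote the same instant produce the same timestamp, because a timestamp is just seconds since the epoch with no timezone of its own. A small self-contained sketch:

```python
from datetime import datetime, timezone, timedelta

# One instant, two representations: noon UTC and 8:00 AM at UTC-4.
utc = datetime(2024, 6, 1, 12, 0, tzinfo=timezone.utc)
minus4 = utc.astimezone(timezone(timedelta(hours=-4)))

assert utc.timestamp() == minus4.timestamp()  # timestamps are zone-free
assert utc == minus4  # aware datetimes compare by instant, not wall time
```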
I recommend looking at something like Java’s JSR-310 library for more information about how a well-designed API would look. In particular, think about how Instant, LocalDateTime, OffsetDateTime, and ZonedDateTime map to Python classes. The Java classes are more strict in their semantics and can serve somewhat as a point of reference for how you “should” think about time.
I admit that I didn't fully explain the danger, because I thought it was somewhat self-evident: if you attempt to compare two datetimes correctly, you must know their timezones. Anyone who has used Google Calendar to schedule a meeting in SF when they live in NYC can attest to that.
> You only need a timezone if there was some kind of locality to the event (e.g.: when _and_ where this happened), but usually you can convert to local time. So storing UTC time is fine, adding the timezone in rare cases.
We're both saying the same thing, here.
While UTC is not _technically_ a timezone, most databases/programming languages/etc. let you attach UTC to a datetime. When you say that you're "storing UTC time", you are implicitly storing it with enough information.
The difference is that I don't like the implicit nature of assuming a timezone-naive datetime is UTC, because that is a very dangerous assumption to make. Just store the timezone, even if it's UTC!
It's a pretty solid sign of an inexperienced developer when they think UTC needs to always be listed as the timezone. Most people with more experience know that if no timezone is attached, the value should always be stored in, and assumed to be in, UTC.
That said, it's rarely mentioned, which is a problem. And as the parent comment mentions, there are valid cases for tracking things in local timezones to retain user-local consistency.
There's a long history of databases using a client-controlled timezone per connection (which defaults to the server's local timezone) and storing only local time. I wouldn't rely on seeing UTC unless the team is very careful about migrating legacy data and consistently using TIMESTAMP WITH TIME ZONE columns and functions.
I think on the first read I didn't get that your article was basically about 'floating times', which upon a second read I totally agree is a weird default and good to avoid!
I thought the same, and went from an Apple standard keyboard to a Moonlander. Some time after that I went to a custom 36-key split ergo board.
It takes a little time to familiarize yourself, but you get very proficient with layers and home-row mods. The main benefit is that every single one of my fingers never has to move more than one key in any direction, and it has markedly improved my wrist health.
The cutoff used to be early 2008, I believe, but that may have changed in the last ~17 years :)