That's how I read this too. The publisher invested a non-trivial amount of work and was left with nothing, for no better reason than the author changed their mind. From the tone of the post, the author seems to not realize or care.
The publisher changed their mind, too. I don’t think the author was pointing fingers, just sharing information that his readers might enjoy or find useful.
That was a lot of words to get to the point that Fleming probably misremembered the sequence of events when he retold the story 15 years later. He even mentioned this possibility at the time. Interesting article but not much of a mystery.
Seems like you would at the very least need a fairly thick application layer on top of Postgres to make it look and act like a messaging system. At that point, it seems like you've just built another messaging system.
Unless you're a five-man shop where everybody just agrees to use that one table, manage transactions correctly, cron-job the retention, YOLO the clustering, etc. etc.
Performance is probably last on the list of reasons to choose Kafka over Postgres.
There's a lot of client-side logic involved in managing read cursors and marking events as processed on the consumer side. Possibly also client-side error queues and so on.
I truly miss a good, standard client-side library following the Kafka-in-SQL philosophy. I started on one at my previous job and we used it internally, but it never got good enough to be widely used elsewhere, and now I work somewhere else...
(PS: Talking about the pub/sub, Kafka-like use case, not the FOR UPDATE work-queue use case)
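Roughly the kind of plumbing I mean for the pub/sub case, sketched in JDBC (table and column names are made up; an append-only events table plus one cursor row per consumer group):

    import java.sql.*;

    // Sketch of the read-cursor pattern on top of Postgres. Each consumer
    // group has a row recording the last event id it has processed; a poll
    // reads the next batch past that cursor and advances it in the same
    // transaction, so a failed batch is simply re-read on the next poll.
    public class PgConsumerSketch {
        static void pollOnce(Connection conn, String group) throws SQLException {
            conn.setAutoCommit(false);
            try {
                long cursor;
                // Lock this group's cursor row so two instances of the same
                // group don't process the same batch concurrently.
                try (PreparedStatement ps = conn.prepareStatement(
                        "SELECT last_event_id FROM consumer_cursors WHERE group_name = ? FOR UPDATE")) {
                    ps.setString(1, group);
                    try (ResultSet rs = ps.executeQuery()) {
                        rs.next();
                        cursor = rs.getLong(1);
                    }
                }

                long newCursor = cursor;
                try (PreparedStatement ps = conn.prepareStatement(
                        "SELECT event_id, payload FROM events WHERE event_id > ? ORDER BY event_id LIMIT 100")) {
                    ps.setLong(1, cursor);
                    try (ResultSet rs = ps.executeQuery()) {
                        while (rs.next()) {
                            handle(rs.getString("payload"));      // application logic
                            newCursor = rs.getLong("event_id");
                        }
                    }
                }

                // Advance the cursor only if the whole batch succeeded.
                try (PreparedStatement ps = conn.prepareStatement(
                        "UPDATE consumer_cursors SET last_event_id = ? WHERE group_name = ?")) {
                    ps.setLong(1, newCursor);
                    ps.setString(2, group);
                    ps.executeUpdate();
                }
                conn.commit();
            } catch (SQLException | RuntimeException e) {
                conn.rollback();
                throw e;
            }
        }

        private static void handle(String payload) { /* ... */ }
    }

And even that much glosses over the nasty part: with a plain sequence, rows committed later can become visible with smaller ids than rows you've already read past, so a naive "event_id > cursor" can skip events. That's exactly the sort of thing a shared library would get right once.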
Ugh, I've had the exact same problem in a Java project, which meant I had to go through thousands and thousands of lines of code and make sure that every 'toLowerCase()' call on enum names included Locale.ENGLISH as a parameter.
As the article demonstrates, the error manifests in a completely inscrutable way. But once I saw the bug from a couple of users with Turkish-sounding names, I zeroed in on it. And cursed under my breath a few times at whoever messed up that character table so badly.
They do. But a generic warning about locale dependence doesn't really tell you that ASCII strings will be broken.
For nearly every purpose ASCII is the same in every locale. If you have a string that is guaranteed to be ASCII (like an enum constant is in most code styles), it's easy to think "not a problem here" and move on.
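Concretely, the failure mode looks like this (the enum and values are made up); it only bites when the JVM's default locale happens to be Turkish:

    import java.util.Locale;

    // Minimal reproduction of the Turkish-locale pitfall. Under tr-TR the
    // default case mappings turn 'i' into dotted 'İ' and 'I' into dotless 'ı',
    // so case conversion on "ASCII-only" enum names silently produces
    // non-ASCII strings.
    public class TurkishLocaleDemo {
        enum Status { INIT, DONE }

        public static void main(String[] args) {
            Locale.setDefault(Locale.forLanguageTag("tr-TR"));

            String input = "init";
            String upper = input.toUpperCase();      // "İNİT", not "INIT"
            System.out.println(upper);

            // Status.valueOf(upper) would throw IllegalArgumentException,
            // because there is no constant named "İNİT".

            // The locale-independent version behaves as intended.
            System.out.println(Status.valueOf(input.toUpperCase(Locale.ENGLISH)));  // INIT
        }
    }

(Locale.ROOT does the job too; the String.toLowerCase/toUpperCase docs point at it for exactly this kind of locale-insensitive string.)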
Unchecked exceptions are more like a shutdown event that can be intercepted at any point along the call stack, which is useful and quite unlike a return type.
Debugging. A stack trace is one of the most useful tools for narrowing down where an error is coming from, and losing it is by far the biggest negative of Rust's Result-type error handling in my experience (panics can of course give a call stack, but because value-based errors are what's most commonly used, that trace is often far away from the actual error).
(it is in principle possible to construct such a stack, potentially with more context, with a Result type, but I don't know of any way to do so that doesn't sacrifice a lot of performance because you're doing all the book-keeping even on caught errors where you don't use that information)
The instrumentation and observability are more heavyweight than the overhead of unwinding the stack, which is already keeping track of the most important information (in most mainstream languages, at least; and even if you don't have a contiguous stack, the same information is usually still around at the point an error is created, assuming you have something like functions returning into other functions). Exceptions, as a model, basically allow the code that raises an error to determine where the error is going to be caught without unwinding away the information that lets you trace from the top level to where the error was raised. It's still a tradeoff, of course (returning errors is more expensive than success), but it's one in a much better place in practice than the other options, as evidenced by the fact that errors-as-values implementations rarely keep this information around, especially not by default.
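As a toy illustration of what that buys you in practice (names made up): the trace back to the origin is captured at the throw site with no extra code in the layers in between, whereas a returned error value only tells the caller that something failed.

    // The exception records the frames parse <- load <- main at the point it
    // is thrown; nothing in load() has to do any book-keeping to preserve that.
    public class TraceDemo {
        static int parse(String s) {
            return Integer.parseInt(s);   // throws NumberFormatException for "oops"
        }

        static int load(String s) {
            return parse(s);              // just passes the value through
        }

        public static void main(String[] args) {
            try {
                load("oops");
            } catch (NumberFormatException e) {
                // Prints a trace pointing at Integer.parseInt <- parse <- load <- main,
                // i.e. exactly where the error was raised.
                e.printStackTrace();
            }
        }
    }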
Those are not Java paradigms per se - they're just practices of mediocre enterprise developers, of which Java has many, owing to being such a mainstream platform.
As for the language itself, a lot of verbosity has been culled in later language versions, with the diamond operator, lambda expressions, local variable type inference, etc.; there's a small before/after sketch below.
In either case, code is read more than it is written, so if a bit of verbosity makes code easier to read, I'm okay with it.
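To make that concrete, the same few lines in old-style and current Java (toy example, nothing project-specific):

    import java.util.ArrayList;
    import java.util.Collections;
    import java.util.Comparator;
    import java.util.HashMap;
    import java.util.List;
    import java.util.Map;

    public class VerbosityDemo {
        public static void main(String[] args) {
            // Old style: element types spelled out on both sides, anonymous
            // class for a one-line comparator.
            Map<String, List<Integer>> scores = new HashMap<String, List<Integer>>();
            List<String> names = new ArrayList<String>();
            Collections.sort(names, new Comparator<String>() {
                @Override
                public int compare(String a, String b) {
                    return a.compareToIgnoreCase(b);
                }
            });

            // Diamond operator (Java 7), lambdas / method references (Java 8),
            // local variable type inference (Java 10).
            Map<String, List<Integer>> scores2 = new HashMap<>();
            var names2 = new ArrayList<String>();
            names2.sort(String::compareToIgnoreCase);
        }
    }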
Yes. A real, raw source is almost certainly better than a processed and treated one. Packaged fish oil is sometimes rancid without you noticing, and those oxidation products are harmful. It is also heavily loaded with antioxidants, without which fresh fish oil goes rancid within hours, and which have their own detrimental effects in excess.