> Signal insists on using your phone number too, refusing user ids or anything that will make analysis hard.
That is no longer true; you can use user IDs now.
For the other problem, you can enable self-deleting messages in group chats, limiting the damage when a chat does become compromised. Of course, this doesn't stop any persistent threat, such as law enforcement (is that even the right term anymore?) getting access to an unlocked phone.
It doesn't mean much if it isn't the default, and even then, people who got Signal before the change are still on phone numbers. You can maybe protect yourself, but not the other people in the group. But it's good they're doing this now.
Right, but I think that the Recycle Bin is exactly what is causing the issue here. Users have been taught for decades that if they delete something, it is not really gone, as they can always just go back to their Recycle Bin or Deleted Items folder and restore it. (I have worked with clients that used the Deleted Items folder in Outlook as an archive for certain conversations, and would regularly reference it.)
So users have been taught that the term "delete" means "move somewhere out of my sight". If you design a UI and make "delete" mean something completely different from what everyone already understands it to mean, the problem is you, not the user.
> Users have been taught for decades that if they delete something, it is not really gone
There are stories all over the internet involving people who leave stuff in their recycle bin or deleted items and then are shocked when it eventually gets purged due to settings or disk space limits or antivirus activity or whatever.
Storing things you care about in the trash is stupid behavior, and I hope most of these people learned their lesson after the first time. But recycle bin behavior is beneficial to a much larger set of people, because accidental deletion is common, especially for bulk actions: “Select all these blurry photos, Delete, Confirm... Oh no, I accidentally deleted the last picture of my Grandma!”
Recycle bin behavior can also make deletion smoother because it allows a platform to skip the Confirm step since it’s reversible.
What you describe is basically event sourcing, which is definitely popular. However, for OLAP, you will still want a copy of your data that only has the actual dimensions of interest, and not their history - and the easiest way to create that copy and to keep it in sync with your events is via triggers.
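To make that concrete, here's a minimal sketch in PostgreSQL syntax, with made-up table names (`order_events` as the append-only event log, `orders_current` as the flattened copy you'd point OLAP queries at):

```sql
-- Hypothetical event table: every change is appended as a new row.
CREATE TABLE order_events (
    event_id   bigserial   PRIMARY KEY,
    order_id   bigint      NOT NULL,
    status     text        NOT NULL,
    total      numeric     NOT NULL,
    event_time timestamptz NOT NULL DEFAULT now()
);

-- Flattened copy holding only the current state, for analytics.
CREATE TABLE orders_current (
    order_id bigint  PRIMARY KEY,
    status   text    NOT NULL,
    total    numeric NOT NULL
);

-- Trigger function: fold each new event into the current-state table.
CREATE FUNCTION apply_order_event() RETURNS trigger AS $$
BEGIN
    INSERT INTO orders_current (order_id, status, total)
    VALUES (NEW.order_id, NEW.status, NEW.total)
    ON CONFLICT (order_id) DO UPDATE
        SET status = EXCLUDED.status,
            total  = EXCLUDED.total;
    RETURN NEW;
END;
$$ LANGUAGE plpgsql;

CREATE TRIGGER order_events_apply
AFTER INSERT ON order_events
FOR EACH ROW EXECUTE FUNCTION apply_order_event();
```

The nice part is that the flattened copy can never drift from the event log, because the database applies the fold in the same transaction as the insert.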
Business processes and the database systems I described (and built) existed before event sourcing was invented. I built what is essentially event sourcing using nothing more than database tables, views, and stored procedures.
Well, Microsoft SQL Server has built-in Temporal Tables [1], which even take this one step further: they track all data changes, such that you can easily query them as if you were viewing them in the past. You can not only query deleted rows, but also the old versions of rows that have been updated.
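For reference, a temporal table declaration and a time-travel query look roughly like this (table and column names are made up):

```sql
-- System-versioned (temporal) table: SQL Server automatically keeps the
-- old version of every row in the history table on UPDATE and DELETE.
CREATE TABLE dbo.Employee (
    EmployeeId int   PRIMARY KEY,
    Salary     money NOT NULL,
    ValidFrom  datetime2 GENERATED ALWAYS AS ROW START NOT NULL,
    ValidTo    datetime2 GENERATED ALWAYS AS ROW END   NOT NULL,
    PERIOD FOR SYSTEM_TIME (ValidFrom, ValidTo)
)
WITH (SYSTEM_VERSIONING = ON (HISTORY_TABLE = dbo.EmployeeHistory));

-- Query the table as it looked at a point in the past, including rows
-- that have since been updated or deleted.
SELECT *
FROM dbo.Employee
FOR SYSTEM_TIME AS OF '2024-01-01T00:00:00'
WHERE EmployeeId = 42;
```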
(In my opinion, replicating this via a `validity tstzrange` column is also often a sane approach in PostgreSQL, although OP's blog post doesn't mention it.)
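A rough sketch of that approach (my own made-up table; note that the non-overlap constraint needs the btree_gist extension):

```sql
CREATE EXTENSION IF NOT EXISTS btree_gist;

-- "Poor man's temporal table": every version of a row carries the
-- time range during which it was the valid one.
CREATE TABLE employee_versions (
    employee_id bigint    NOT NULL,
    salary      numeric   NOT NULL,
    validity    tstzrange NOT NULL DEFAULT tstzrange(now(), NULL),
    -- No two versions of the same employee may have overlapping validity.
    EXCLUDE USING gist (employee_id WITH =, validity WITH &&)
);

-- "Update" = close the currently open version, then insert a new one.
UPDATE employee_versions
SET validity = tstzrange(lower(validity), now())
WHERE employee_id = 42 AND upper_inf(validity);

INSERT INTO employee_versions (employee_id, salary)
VALUES (42, 75000);

-- Time-travel query: the row as it was at a given instant.
SELECT *
FROM employee_versions
WHERE employee_id = 42
  AND validity @> '2024-01-01 00:00:00+00'::timestamptz;
```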
MariaDB has system-versioned tables, too, albeit a bit more limited than MS SQL's: you cannot configure how the history is stored, so the old row versions are basically hidden away in the same table or in a separate partition: https://mariadb.com/docs/server/reference/sql-structure/temp...
This has, at least with current MariaDB versions, the annoying property that you really cannot ever modify the history again without rewriting the whole table, which becomes a major pain in the ass if you ever need schema changes and the history rows block them.
MariaDB still has to find a proper balance here between change safety and developer experience.
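For comparison, a MariaDB system-versioned table with the history kept in its own partition looks roughly like this (names made up):

```sql
-- History rows live in the same table, optionally shoved into their
-- own partition; there is no separate, user-chosen history table.
CREATE TABLE employee (
    employee_id BIGINT PRIMARY KEY,
    salary      DECIMAL(10,2) NOT NULL
) WITH SYSTEM VERSIONING
PARTITION BY SYSTEM_TIME (
    PARTITION p_hist HISTORY,
    PARTITION p_cur  CURRENT
);

-- Time-travel queries work the same way as in SQL Server.
SELECT *
FROM employee
FOR SYSTEM_TIME AS OF TIMESTAMP '2024-01-01 00:00:00'
WHERE employee_id = 42;
```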
Claude Code already is the purple unicorn. We're already there - the only problem is that regulatory systems are set up in a way that benefits a small minority of capitalists, rather than the majority.
I've thought about doing that, but it seems to require multiple Google accounts - one for the "child" and one for the "parent", which is hard to achieve without also having multiple SIM cards with different phone numbers that can be used for the account registration. I assume the process is designed to be full of friction to prevent people from freeing themselves of the addiction.
You don't need multiple SIM cards; you can just create new Google accounts. I have my main Google account set as the child of a different account and it works great. But the setup was somewhat annoying; it took me a couple of hours.
I've been putting off writing a guide on how to set this up for a while, but I think I'm motivated to do it now. I'll try to have something up by the 8th here: https://tim2othy.github.io/ws/screen-time/ - maybe it'll be useful for you.
I think you would be hard-pressed to find a modern AA game that does not already use a GC. The major game engines Unreal and Unity are garbage collected - although they use manual memory management for some of their internals, the exposed API surface (including the C++ API) is designed with garbage collection in mind.
Notably, the popular-with-hobbyists Godot Engine does not use a garbage collector. It uses reference counting for some objects, but does not provide cycle detection, and thus requires all objects to be laid out in a tree structure (which the engine is built around).
I said "a GC", that is, "a garbage collector". Even if you consider reference counting as technically being garbage collection, purely reference counted systems do not have a distinct entity that can be identified as "a" garbage collector. So I'm technically correct here even in face of this pedantry.
Not that I think it's a reasonable approach to language to be pedantic on this. RC being GC is, of course, true from an analytic approach to language: a garbage collection system is defined as a system that collects and frees objects that are unreachable and thus dead; a reference counting pointer collects and frees objects that are unreachable and thus dead; therefore, reference counting is garbage collection.
One problem with this is the vagueness: now, the use of a call stack is garbage collection; after all, returning from a function collects and frees the objects in the stack frame. Leaking memory all over the place and expecting the operating system to clean up when you call `exit()` likewise is "garbage collection".
But more importantly, it's just not how anyone understands the word. You understood perfectly well what I meant when I said "you would be hard-pressed to find a modern AA game that does not already use a GC"; in other words, you yourself don't even understand the word differently. You merely feel an ethical imperative to understand the word differently, and when you failed to do so, used my comment as a stand-in to work through the emotions caused by your own inability to live up to this unfulfilled ethic.
Except they do, when one bothers to read computer science reference literature instead of blog posts from folks who learned programming on their own.
Being pedantic is a required mechanism for fixing urban myths; otherwise we end up with he-said-she-said, adulterated knowledge.
All those "garbage collection" variations are exactly the proof what happens when people on the street discuss matters without having a clue about what they are talking about, it is like practice medecine with village recipes "I hear XYZ cures ABC".
It is not vague; IEEE and ACM have plenty of literature on the matter.
They are not interchangeable. The semantics are observably different. Therefore, RC is not GC.
Reference counting gives you eager destruction. GC cannot.
GC lets you have garbage cycles. RC does not.
I think a part of the GC crew reclassified RC as GC to try to gain relevance with industry types during a time when GC was not used in serious software but RC was.
But this is brain damage. You can’t take an RC C++ codebase and replace the RC with GC and expect stuff to work. You can’t take a GC’d language impl and replace the GC with RC and expect it to work. The best you could do is use RC in addition to GC, so you still keep the GC semantics.
> GC lets you have garbage cycles. RC does not.
This is the biggest difference, but if you disallow cycles then they come close. For example, the jq programming language disallows cycles, therefore you could implement it with RC or GC and there would be no observable difference except "eager destruction", but since you could schedule destruction to avoid long pauses when destroying large object piles, even that need not be a difference. But of course this is a trick: disallowing cycles is not a generic solution.
> Reference counting gives you eager destruction. GC cannot.
Tracing GC can't. Reference counting, which is by definition a GC, can. It's like insects vs. bugs.
And destructors are a specific language feature. No one says they are a must-have, and if you don't have them, then you can replace an RC with a tracing GC. Not that it matters: a ladybug is not the same as an ant, but they are both insects.
The best part of these conversations is that if I say “garbage collection”, you have zero doubt that I am in fact referring to what you call “tracing garbage collection”.
You are defining reference counting as being a kind of garbage collection, but you can’t point to why you are doing it.
I can point to why that definition is misleading.
Reference counting as most of the industry understands it is based on destructors. The semantics are:
- References hold a +1 on the object they point to.
- Objects that reach 0 are destructed.
- Destruction deletes the references the object holds, which in turn decrements the counts of the objects they point to.
This is a deterministic semantics and folks who use RC rely on it.
This is nothing like garbage collection, which just gives you an allocation function and promises you that you don’t have to worry about freeing.
They are different approaches to the same thing: automatic memory management. (Which is itself a concept that isn't trivial to define.)
One tracks liveness, while the other tracks "deadness", but as you can surely imagine on a graph of black and white nodes, collecting the whites and removing all the others vs one by one removing the black ones are quite similar approaches, aren't they?
You’re not going to convince me by citing that paper, as it’s controversial in GC circles. It’s more of a spicy opinion piece than a true story.
I agree that RC and GC are both kinds of automatic memory management.
RC’s semantics aren’t about tracking deadness. That’s the disconnect. In practice, when someone says “I’m using RC”, they mean that they have destructors invoked when the count reaches zero, which then may or may not cause other counts to reach zero. If you squint, this does look like a trace - but by that logic, everyone writing recursive traversals of data structures is writing a garbage collector.
An RC algorithm implementation that uses a cycle collector, or deferred deletion on a background thread, to reduce the impact of stop-the-world cascade deletion, is....
The problem there is probably that Java cannot pass objects by value [1]. That incurs an additional layer of indirection when accessing the individual members of the struct, tanking performance.
That's not a necessity, though - you can use a GC in languages that allow you to control whether structs get allocated on the heap or on the stack, and then you don't have this issue. For example, in Go, structs can be allocated on the stack and passed by value, or they can be allocated on the heap and passed by reference, and this is under the control of the application programmer [2].
[1]: Actually, according to the Java spec, Java does not have pass-by-reference, and objects are always passed by value. However, that's just strange nomenclature - in Java parlance, "object" names the reference, not the actual range of memory on the heap.
[2]: The language spec does not guarantee this, so this is technically implementation-defined behavior. But then, there's really only one implementation of the Go compiler and runtime.