Does a reference to "...the judge, jury and executioner..." really make sense in armed conflict? Is there really a judge or a jury? There isn't really even an executioner, in the sense of a lawful delegate tasked with carrying out the result of adjudication.
One of the reasons armed conflict is bad is there is really no justice in it and no time for justice. Justice starts to be possible when security is established, and security is established through armed conflict or a strong norm not to get into it -- as we see presently in Europe, where many countries with meaningful territorial losses and weird borders (exclaves, &c) have elected to just never settle those things.
Back when decent civilization was a thing, there were rules of engagement, conduct, the pursuit of security, and strategic goals which didn't include active genocide of civilians.
Now, granted, we've witnessed horrible things in wars that don't match the order and clarity of my previous sentence. But there were end goals that made sense.
Sorry, but genocide, apartheid and the establishment of a religious-fascist state at the behest of Israeli right-wing fascists who wouldn't have put a foot wrong in Hitler's RKF isn't an end goal I'd say justifies the means, the ends or anything in between.
The establishment of security to the denial of all else isn't the only dish on the table.
Hamas chooses to fight in urban locations. And keeps the civilians in place at gunpoint.
Consider an attack for which Israel was blamed for a large number of civilian casualties. Israel had given warning that they were going to hit the building: get out. Reality: Hamas ordered all the neighbors to rush to the roof of the building to keep Israel from hitting it. Too slow -- they were still inside when the bomb landed.
Even if that were true - and it's likely true in some cases - the disregard Israel has shown has been appalling. You don't get out of this by blaming Hamas. You simply don't. If you believe you do - I'll be writing you off as a disgusting apologist.
I'm not going to rehash the war crimes Israel has committed during the last two years. It's likely a waste of time as you already appear to be said apologist. A useful tool to those I don't see as any different to Nazi expansionists ...
> Back when decent civilization was a thing, there were rules of engagement, conduct, the pursuit of security, and strategic goals which didn't include active genocide of civilians.
What period of human history are you referring to exactly?
Lasers need a straight path through clean air. Israel is a favorable location because Tel Aviv gets 200 or so sunny days a year, but if there are clouds this won't work, or the system will have to fire at the last moment.
As for drones, they’ll fly lower to the ground to reduce the line of sight.
For antipersonnel use, guns are perfectly adequate and guns on tracking turrets have been widely deployed (for example, CIWS). The underlying technology is a ballistic calculator and a fast panning turret. Modern ballistic calculators, weather stations (small devices about the size of a cellphone), and good quality ammunition allow for incredible precision with small arms -- hitting something 25cm in diameter at 1000m is something people can do with these tools.
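To give a feel for what a ballistic calculator actually computes, here is a minimal sketch: a point-mass trajectory integrated numerically to get the vertical drop at range, which the turret compensates for as a hold-over angle. The muzzle velocity and the lumped drag coefficient below are illustrative placeholders, not data for any real cartridge:

```python
import math

def drop_at_range(range_m: float, muzzle_v: float, drag_k: float = 0.0,
                  g: float = 9.81, dt: float = 0.001) -> float:
    """Integrate a point-mass trajectory fired horizontally and return the
    vertical drop in metres at the given range. drag_k is a lumped drag
    coefficient (per metre of travel); 0 gives the vacuum solution."""
    x, y, vx, vy = 0.0, 0.0, muzzle_v, 0.0
    while x < range_m:
        v = math.hypot(vx, vy)
        vx -= drag_k * v * vx * dt           # drag opposes motion
        vy -= (g + drag_k * v * vy) * dt     # gravity plus drag
        x += vx * dt
        y += vy * dt
    return -y

# In vacuum, at ~830 m/s, 1000 m takes ~1.2 s, so the drop is roughly
# 0.5 * g * t^2, about 7 m -- a hold-over angle of atan(7/1000), well
# under half a degree, which is trivial for a computer-driven turret.
drop = drop_at_range(1000, 830)
```

The real work in a fielded calculator is in the drag model and the atmospheric inputs (hence the weather station); the geometry itself is this simple.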
A weapon like this can't really "mass kill" -- it is for point targets -- but we have long had tools that can automatically track and kill. Why don't we employ them to shoot at people? We have the tagging technology, &c, as you mention.
One reason is that positive identification really does matter a lot when designing and developing weapon systems that automatically attack something.
The anti-missile use case is one of the most widespread uses for automatically targeted weapons in part because a missile is easily distinguished from other things that should not be killed: it is small, extremely hot, moves extremely fast, generally up in the air and moves towards the defense system. It is not a bird, a person, or even a friendly aircraft. The worst mistake the targeting system can make is shooting down a friendly missile. If a friendly missile is coming at you, maybe you need to shoot it down anyways...
Drones have a different signature from a missile, and recognizing them in a way that doesn't confuse them with a bird, a balloon, &c, is different from recognizing missiles -- but here again, the worst thing that happens is you shoot down a friendly drone.
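The discrimination argument above can be sketched as a toy classifier. The features and thresholds here are illustrative inventions, not values from any fielded system; the point is only that a missile's combination of speed, heat and closing geometry barely overlaps with anything worth protecting, while the drone signature overlaps birds and balloons and needs more evidence:

```python
from dataclasses import dataclass

@dataclass
class Track:
    speed_mps: float     # measured speed
    altitude_m: float
    ir_intensity: float  # normalised thermal signature, 0..1
    closing: bool        # moving toward the defended point

def classify(t: Track) -> str:
    """Toy signature-based discrimination with made-up thresholds."""
    if t.speed_mps > 250 and t.ir_intensity > 0.8 and t.closing:
        return "missile"           # fast, hot, inbound: nothing friendly looks like this
    if 10 < t.speed_mps < 70 and t.altitude_m < 500:
        return "possible drone"    # overlaps birds/balloons: needs more sensors
    return "no engagement"

verdict = classify(Track(600, 2000, 0.95, True))
```

Note how the missile branch can afford to fire automatically while the drone branch cannot: that asymmetry, not raw tracking ability, is what has kept automatic engagement confined to anti-missile use.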
And note an advantage to lasers -- when you fire ordinary stuff, it falls back down. C-RAM is specifically designed so that misses detonate while still in the air, but no munition has a 100% fusing rate; you get duds. Nothing falls back from a laser.
CIWS is pretty massive -- not that this isn't still big -- but I think this is taking a miniaturization turn, significantly upping the accuracy and the number of engagements it can handle, and potentially upping the range, especially in urban environments. CIWS in an urban environment would cause chaos and a lot of collateral damage to buildings, but you can now be very sure that only your intended target is being hit, so people could die without all the optics of buildings crashing down. It is much easier to have a war when the cameras don't see the destruction.

Positive ID is huge, if you really care about it, but even with perfect positive ID, if a government is OK with genocide then everyone is a valid target. Are you a male older than 13? You are a combatant and will be killed once you are in sight. Did someone help you in any way (like your mother or family giving you food)? They are also combatants. It is unfortunately not a stretch with modern tools to see this happening in real time. This weapon is, unfortunately, on an inevitable path.
CIWS is big but this has nothing to do with it -- it's actually easier to make a small turret, and small arms precision has been well understood for a long time. Put a 6.5mm Creedmoor on a computer controlled turret -- 6.5mm Creedmoor is generally accepted to be usable to 1km or more.
Range is limited in urban environments because of obstructions -- even the range of CIWS is far too great to be useful.
There hasn't been a real possibility for a long time, I don't think -- it's just not an easy use case.
> Are you a male older than 13? You are a combatant and will be killed once you are in sight.
This is exactly the kind of thing that is unworkable.
(A) You don't want to shoot all those people. It's rarely if ever the case that even 10% of those males are actually combatants. Even in Germany at the end of WW2, I doubt it was that high.
(B) What if your own people make a breakthrough and take control of an area, and have all these machines with wildly nonspecific rules shooting at them?
Range due to obstacles is greatly overcome with altitude. My point about the 13yo is that -you- think it is unworkable, but a country that doesn't mind the word 'genocide' thinks it is a fine definition. Camera tech quickly went this route, right? 'You could mount that camera, but we haven't done it and therefore won't' turned into multiple cameras covering every square inch of a city from multiple angles once the tech was easy enough. The 'easy enough' trend is clear here. Miniaturization, precision, ease of maintenance, etc. make the reasons this hasn't been done rapidly fall away and make it clear that it will be done. There is a clear argument that it isn't, yet, realistic to be done, but this is a clear step in that direction.
I don't think there is a clear argument that it isn't realistic to be done from a technology standpoint -- in other words, I don't think this laser meaningfully changes things from a capabilities standpoint. The necessary miniaturization and precision are available.
Now, you may think I have the facts wrong, here -- that we haven't had the kind of precise turret before, or that we can't deliver small arms ammunition with great precision -- but you don't come out and say that: you haven't said I have bad facts.
If we accept that the technical capabilities have been there for a while, then we need another explanation for what the hold up is. I have offered an alternative, which is that it comes down to doctrine or operational issues -- it's not easy to see how to deploy a weapon system that automatically targets people without creating huge practical problems. I offered two concrete cases in my earlier comment. Here again, you haven't really spoken to them: you haven't said, for example, A is not a problem and here's why not. You have just ignored them.
It is really starting to look like you have a story and you are sticking to it.
It was not that long ago, that most countries regulated products, communications, food, and many other things, even arms & munitions, very, very lightly. In the UK in 1903, a law was passed prohibiting the sale of pistols to children. The UK is a country where adults who have served in the military have a hard time buying a pistol today.
From 1900 onwards, the scope of safety regulation greatly expanded, and the state apparatus necessary to make that regulation stick also expanded. Different countries have gone in different directions with it. The US has a lot less safety than many other countries, but our regulations and regulatory apparatus greatly expanded, too. It's easy to sell safety to voters and with improving technology and information systems, more and more safety was possible.
We are probably approaching a local maximum of some kind in our approach to safety; or maybe we just suffer from a maniacal focus on it. Legislators are ever more willing to set aside fundamental rights in the name of protecting the vulnerable from harm.
It does stand to reason that all law could be formalized. For example, consider the definition of murder from 18 USC § 1111:
"Murder is the unlawful killing of a human being with malice aforethought."
You might say, well, "unlawful" and "malice" are fuzzy concepts; but we can take them to be facts that we input into the model. I guess we could write something like this in Catala:
  scope Murder:
    definition in_the_1st_degree
      under condition is_malice_aforethought and is_unlawful
      consequence equals true
In the calculation of social benefits and taxes, the facts input to the model are generally things like prices, depreciations, costs, areas of offices, percentages and so on, input numerically and sworn to be true. These numbers are then used to calculate an amount due (or in arrears). Performing the calculation in a way that is verified to conform to the law is a big part of the work.
However, in other areas of law, determining the facts is actually where the real work is -- was there malice aforethought? A formalized legal machine could process these facts but it's not a big help. The models would just be a huge list of assumptions that have to be input and a minimal calculation that produces `true` or one of the alternatives of an enum.
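To make that last point concrete, here is a hypothetical Python rendering of the same rule (names mirror the Catala sketch above; this is an illustration of the thesis, not the actual statute's full first-degree test, which also involves premeditation):

```python
from enum import Enum

class Verdict(Enum):
    FIRST_DEGREE = "murder in the first degree"
    NOT_ESTABLISHED = "not established on these facts"

def murder_first_degree(is_unlawful: bool,
                        is_malice_aforethought: bool) -> Verdict:
    """The 'calculation' is trivial; everything hard -- was the killing
    unlawful? was there malice aforethought? -- happens before these
    booleans are ever produced."""
    if is_unlawful and is_malice_aforethought:
        return Verdict.FIRST_DEGREE
    return Verdict.NOT_ESTABLISHED
```

The contrast with the tax case is visible in the signature: a tax model takes rich numeric inputs and does real arithmetic; this model takes two booleans that each conceal a trial's worth of fact-finding.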