It irks me endlessly that we live in the worst timeline, where the computer equivalent of fuses and circuit breakers are almost completely unknown. Instead we trust code blindly.
This results in almost all of the situations threads here address.
In a better timeline, everyone has stable and secure OSs on all their devices, and the default is for everything to be locally networked, with optional monitoring from the outside via a data diode.
It's incredibly hard to implement a data diode for PV systems: an enemy satellite can modulate light (like a TV remote, but at a lower baud rate to stay below the noise floor), and an inverter could decode it and respond accordingly.
Sunlight is about 1000 W/m^2 on the ground. Do you have any idea how much power you'd have to send through a Jewish Space Laser(tm) to get even a -40 dB s/(s+n) signal to pull out of the noise via a side-channel attack?
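For what it's worth, the dB arithmetic here is easy to check. A minimal sketch (the ~1000 W/m^2 solar irradiance figure is from the comment above; everything else is just unit conversion):

```python
# What does -40 dB mean against full sunlight at the panel?
sunlight = 1000.0                      # W/m^2, the "noise floor" in daylight
ratio_db = -40.0                       # target signal-to-(signal+noise) ratio
ratio = 10 ** (ratio_db / 10)          # linear power ratio: 1e-4
signal_irradiance = sunlight * ratio   # laser irradiance needed at the panel
print(signal_irradiance)               # ~0.1 W/m^2
```

So -40 dB below full sun is roughly 0.1 W/m^2 delivered at the panel; how much transmitter power that implies depends entirely on beam divergence and range, which this sketch doesn't touch.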
It would be interesting to use a solar panel as a sensor for moonbounce of an IR laser, though. You'd have to try at the almost new moon, with a window of a few hours/month ;-)
You're describing two very different concepts at the start.
A data diode applies to a specific connection. It's easy to have a serial port that goes one way.
Preventing any possible input to an already compromised device is much harder. But if your device isn't already compromised then it won't be looking at the input light levels for commands.
It's quite trivial, really: what's needed to capture a weak signal with known modulation out of the background is integration time. Think of how deep-space light can be captured by digital cameras with "long" exposure times: they reveal light the human eye can't see, because the integration time of light on the retina is too short.
Now, other light sources will also integrate over time; this is where the modulation scheme comes in. First, consider how long you'd have to integrate the noisy signal to raise it above the noise floor: that's the on-time you need.

How do we remove background light variations from other sources? Consider a discrete-time, pre-agreed pseudorandom sequence that has "0" periods as often as "1" periods. To remove a constant background, you calculate the sum of the measured light intensities over all "1" periods and the sum over all "0" periods, then subtract the "0"-sum from the "1"-sum: a constant signal cancels itself out, while the satellite's signal accumulates with every "1" period. Since your pseudorandom sequence was kept secret, random variations in light (think of a bird passing by) will not conspire to selectively block light during the "1" periods, so such noise is uncorrelated with your pseudorandom signal. Adding N uncorrelated noise samples grows only as sqrt(N), so the S/N ratio grows as sqrt(N).

These are widely understood methods: an engineer might call it lock-in amplification, a physicist might call it correlation. This is very basic engineering/science knowledge. It's baffling that people consider this "hard" to execute; sure, if you're the milkman in a village, this is hard to execute.
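The whole scheme fits in a few lines. A minimal sketch of the lock-in/correlation trick described above (all numbers are illustrative assumptions: per-sample signal power 40 dB below the noise, a large constant background, a million chip periods):

```python
import random

random.seed(0)

N = 1_000_000          # chip periods to integrate over
signal_amp = 0.01      # signal amplitude: power is 1e-4 of noise, i.e. -40 dB
noise_sigma = 1.0      # per-sample noise standard deviation
background = 500.0     # large constant background (e.g. steady sunlight)

# Pre-agreed secret pseudorandom chip sequence, balanced so "1" periods
# occur exactly as often as "0" periods and the constant background cancels.
chips = [1] * (N // 2) + [0] * (N // 2)
random.shuffle(chips)

# Lock-in step: accumulate "1"-period samples positively and "0"-period
# samples negatively. The signal is only present during "1" periods.
acc = 0.0
for c in chips:
    sample = background + signal_amp * c + random.gauss(0.0, noise_sigma)
    acc += sample if c else -sample

# Background cancels exactly; the signal adds up N/2 times (~5000 here),
# while uncorrelated noise grows only as sqrt(N) (~1000 here), so the
# signal emerges from noise 100x stronger per sample.
print(acc)
```

With these numbers the correlated sum comes out around 5000 against a noise contribution of order 1000, i.e. the buried signal is detected at several sigma, exactly the sqrt(N) gain described above.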