
systematic checkmate

In physics there is the problem of "Gimbal Lock", which is a problem for satellites.

Gimbal lock is the loss of one degree of freedom in a multi-dimensional mechanism at certain alignments of the axes.
https://en.wikipedia.org/wiki/Gimbal_lock

Basically, a set of moves can be performed after which the fitted equipment can no longer change the orientation of the satellite in certain ways.
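To see the lost degree of freedom concretely, here is a minimal sketch in Python (assuming numpy and an intrinsic Z-Y-X Euler-angle convention; the helper names are made up for illustration): at 90° pitch, two quite different yaw/roll settings produce exactly the same orientation.

import numpy as np

def rot_x(a):
    c, s = np.cos(a), np.sin(a)
    return np.array([[1, 0, 0], [0, c, -s], [0, s, c]])

def rot_y(a):
    c, s = np.cos(a), np.sin(a)
    return np.array([[c, 0, s], [0, 1, 0], [-s, 0, c]])

def rot_z(a):
    c, s = np.cos(a), np.sin(a)
    return np.array([[c, -s, 0], [s, c, 0], [0, 0, 1]])

def orientation(yaw, pitch, roll):
    # intrinsic Z-Y-X Euler angles: yaw, then pitch, then roll
    return rot_z(yaw) @ rot_y(pitch) @ rot_x(roll)

gimbal_lock_pitch = np.pi / 2  # 90 degrees: the singular alignment
# Two different (yaw, roll) pairs that differ by the same amount
R1 = orientation(yaw=0.1, pitch=gimbal_lock_pitch, roll=0.4)
R2 = orientation(yaw=1.1, pitch=gimbal_lock_pitch, roll=1.4)
print(np.allclose(R1, R2))  # True: yaw and roll now rotate about the same axis

Near this alignment the controller can still command yaw and roll, but only their combination has any effect, so one direction of rotation is temporarily out of reach.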

Other mechanical and also logical systems can put themselves into a configuration from which they can no longer free themselves without external input.

For logical systems, this usually happens because the designer of the system either didn't consider the consequences of a move, or was unaware of the "trap", or didn't realize that the design would result in such a trap. In programming, this would be the "infinite while loop", e.g. in Python:

while True:
	pass  # the condition never changes, so the loop can never exit on its own

Because systems of rules, or systems governed by rules such as laws, are logical systems, they run the risk of encountering such a "checkmate" situation. They usually reserve some amount of sovereign power for some part of the system to lift such a situation. But this brings with it a new set of challenges and problems:

Sometimes the cause is making assumptions about the part of the system that is supposed to enable the self-correcting behavior, when in reality those assumptions aren't true.
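A software analogue of such a "checkmate", and of the reserved power used to lift it, is a deadlock that gets broken by a timeout. Below is a minimal sketch in Python (plain threading; the lock names, worker names and the one-second timeout are only illustrative):

import threading

lock_a = threading.Lock()
lock_b = threading.Lock()

def worker(first, second, name):
    # Each worker takes its two locks in a different order, which can "checkmate" both.
    with first:
        # Without the timeout, both workers could wait on each other forever.
        if second.acquire(timeout=1.0):
            try:
                print(f"{name}: got both locks, doing work")
            finally:
                second.release()
        else:
            print(f"{name}: gave up and backed out to break the deadlock")

t1 = threading.Thread(target=worker, args=(lock_a, lock_b, "worker-1"))
t2 = threading.Thread(target=worker, args=(lock_b, lock_a, "worker-2"))
t1.start(); t2.start()
t1.join(); t2.join()

The timeout plays the role of the reserved escape hatch: it doesn't make the locking scheme correct, it only guarantees the system can back out of the trap, and it rests on the assumption that waiting one second is always an acceptable way out.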

The term I may be looking for could be https://en.wikipedia.org/wiki/Attractor