That's a myth, measured with respect to full scale. I've only tested one camera with significant highlight nonlinearity before clipping, and that camera only exhibited it at one ISO setting. I think the myth got started because of the different ways that cameras' metering systems are set up.
As I said in the OP, I'm ignoring read noise here. And precision doesn't play a part in this graphic if the ADC precision is 12 bits or more, so I'm not understanding what you're getting at here.
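To put a rough number on why precision drops out at 12 bits and up, here's a quick back-of-the-envelope sketch in Python. The 50,000 e- full well is an assumed figure for illustration, not a measurement of any particular camera:

```python
# Rough comparison of 12-bit quantization noise to photon (shot) noise
# near full scale. The full-well figure is assumed for illustration.
import math

full_well_e = 50_000                      # assumed full-well capacity, electrons
adc_bits = 12

step_e = full_well_e / (2 ** adc_bits)    # one ADC count, in electrons
quant_noise_e = step_e / math.sqrt(12)    # RMS quantization noise
photon_noise_e = math.sqrt(full_well_e)   # shot noise at full scale

print(f"quantization noise ~ {quant_noise_e:.1f} e-")     # ~3.5 e-
print(f"photon noise at clip ~ {photon_noise_e:.0f} e-")  # ~224 e-
```

With numbers like those, 12-bit quantization noise is buried almost two orders of magnitude below the photon noise near clipping, so it doesn't change anything the graphic shows.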
I wonder if that myth was started by including highlight reconstruction, i.e., the camera’s behavior after clipping of at least one channel. I have observed different amounts of highlight reconstruction potential with different cameras (Leica M), but have not looked into it in more detail. Highlight recovery is a game of luck, as one never knows whether color will be preserved or not. Nonetheless, it is very helpful when exposing incorrectly.
As I understand it, highlight reconstruction is affected in the camera design mainly by the compromise matrix, and thus the CFA. But highlight reconstruction itself is not performed in camera; it is done by the raw developer.
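For anyone curious what that raw-developer step can look like, here's a minimal sketch in Python of one common heuristic, assuming neutral highlights and made-up white-balance multipliers. Real converters are more sophisticated, and the neutral assumption failing is exactly why color sometimes isn't preserved:

```python
# Minimal sketch of one raw-developer heuristic: when exactly one channel
# of a pixel clips, re-estimate it from the unclipped channels using the
# white-balance ratios. Numbers and function are illustrative only.
def reconstruct_pixel(r, g, b, clip=1.0, wb=(2.0, 1.0, 1.5)):
    """Return (r, g, b) with a single clipped channel re-estimated.

    r, g, b are raw values normalized to [0, 1]; wb are white-balance
    multipliers that map a neutral gray to equal channel values.
    """
    vals = [r, g, b]
    clipped = [i for i, v in enumerate(vals) if v >= clip]
    if len(clipped) != 1:
        return tuple(vals)            # nothing clipped, or too many to estimate
    i = clipped[0]
    others = [j for j in range(3) if j != i]
    # Assume the pixel is roughly neutral and scale from the unclipped channels.
    estimate = sum(vals[j] * wb[j] for j in others) / len(others) / wb[i]
    vals[i] = max(vals[i], estimate)  # never darken the clipped channel
    return tuple(vals)
```

The "game of luck" part is visible right in the neutral assumption: when a clipped highlight isn't actually neutral, the estimate lands on the wrong color.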
Shoulder is what I see ;)
Even more interesting in photo-voltaic mode (used in some technical sensor applications; I use one of those to shoot welding).
With one exception at one ISO setting, all the cameras I've tested are designed so that the ADC clips before the curve above bends over far enough to be photographically useful.
In order to relate the graphic to DR, I'd have to bring in read noise and photon noise, and also pick an SNR threshold. That would make it camera specific. I didn't want to do that. DR is full scale over the mean signal at some threshold SNR, so you can't read DR off the graphic. What the graphic does show is the effect of ISO setting on full scale, with full scale measured in lux-seconds of exposure.
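If you did want to fold those in, the calculation would look roughly like this sketch in Python. The full scale, read noise, and SNR threshold are all assumed values for illustration:

```python
# Sketch of reading DR off a simple noise model: find the signal where SNR
# crosses a chosen threshold, then take full scale over that signal.
# All numbers are assumptions for illustration, not any particular camera.
import math

full_scale_e = 50_000     # assumed clipping level, electrons
read_noise_e = 3.0        # assumed read noise, electrons RMS
snr_threshold = 1.0       # pick your own threshold SNR

def snr(signal_e):
    """SNR with photon (shot) noise plus read noise."""
    return signal_e / math.sqrt(signal_e + read_noise_e ** 2)

# Bisection for the signal where SNR crosses the threshold.
lo, hi = 1e-3, full_scale_e
for _ in range(60):
    mid = (lo + hi) / 2
    if snr(mid) < snr_threshold:
        lo = mid
    else:
        hi = mid
floor_e = hi

dr_stops = math.log2(full_scale_e / floor_e)
print(f"noise floor ~ {floor_e:.2f} e-, DR ~ {dr_stops:.1f} stops")
```

Every one of those inputs is camera specific, which is exactly why I left it out of the graphic.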
For camera-specific DR vs ISO curves, see Bill Claff's PDR charts.