Since we work together I know you already know this answer, but I wanted to post it so that fellow Stack Overflow users can have it as well.
We have found comments suggesting that the touch-screen drivers on most devices simply do not provide this data to the system.
We originally tested the Samsung Galaxy S2, HTC One, Nexus 5 (by LG), Nexus 7 (by Asus) and Samsung Galaxy Tab 3.
Unlike the others, the Samsung Galaxy Tab 3 did give different values for getTouchMajor() and getTouchMinor(), but they were locked to the fixed relationship getTouchMajor() = getTouchMinor() * 3, and getOrientation() was always 0, as on all the other devices.
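For anyone who wants to reproduce the test, here is a minimal sketch of the kind of logging we used, assuming a custom View subclass (the class name and log tag are just illustrative, not part of any existing API):

    // Minimal probe view that logs the contact-ellipse data for each pointer.
    import android.content.Context;
    import android.util.Log;
    import android.view.MotionEvent;
    import android.view.View;

    public class TouchShapeProbeView extends View {

        private static final String TAG = "TouchShapeProbe";

        public TouchShapeProbeView(Context context) {
            super(context);
        }

        @Override
        public boolean onTouchEvent(MotionEvent event) {
            // Log the reported ellipse axes and orientation for every pointer.
            for (int i = 0; i < event.getPointerCount(); i++) {
                Log.d(TAG, "pointer " + event.getPointerId(i)
                        + " major=" + event.getTouchMajor(i)
                        + " minor=" + event.getTouchMinor(i)
                        + " orientation=" + event.getOrientation(i));
            }
            return true; // keep receiving MOVE/UP events for this gesture
        }
    }

On the devices listed above, major and minor were either identical or stuck at that 3:1 ratio, and orientation never left 0.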
About two months later we discovered that the Google Nexus 10 can show an ellipse with an orientation line when you enable Pointer location under Developer options.
Our initial conclusion was that most devices do not support getTouchMajor(), getTouchMinor() or getOrientation(), which could be a limitation of their capacitive touch screens.
But seeing the Nexus 10 track the ellipse and its orientation gives hope for new interaction designs.
It indicates to me that some devices do deliver meaningful values for getTouchMinor(), getTouchMajor() and getOrientation() (or the historical variants of the same functions). I have not had the chance to write code for that device myself, but it seems plausible.