I'm currently working on a mathematical formula that allows me (and, once I fully master it, other people too) to dynamically change the lighting on surfaces of the face (something like "bump" or "normal" maps, if anyone knows a thing about 3D modeling).
Like when you have a hand that in reality has a triangular cross-section: when you light it from one side, that side gets brighter, and the same can happen for the other side.
For that, I'm using accelerometerRawY(), and while testing I came across a possible bug.
I think it's more of an oversight than a bug, but still.
When you preview a face in the mobile app on a phone/tablet, the accelerometer readings don't rotate along with the screen orientation.
To give an example: with my device in its normal orientation it works fine, but when it switches to portrait mode (vertical view) all the readings stay the same, so the movement of the device (whose accelerometers are still mapped for widescreen mode) no longer matches reality.
Any chance this could be fixed?
It should just be a matter of adding a "-" to the equations (and swapping axes) based on the orientation...
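Roughly what I mean, sketched in Python (the four rotation steps and the exact sign convention are my assumptions; the real fix would live inside the app):

```python
def remap_accel(ax, ay, rotation_deg):
    """Remap raw accelerometer X/Y so they follow the screen orientation.

    rotation_deg: screen rotation relative to the 'natural' orientation,
    assumed to be one of 0, 90, 180, 270. Each 90-degree step swaps X
    and Y and flips one sign -- the "just add a minus" fix.
    """
    if rotation_deg == 0:
        return ax, ay
    if rotation_deg == 90:
        return -ay, ax
    if rotation_deg == 180:
        return -ax, -ay
    if rotation_deg == 270:
        return ay, -ax
    raise ValueError("rotation must be 0, 90, 180, or 270")
```

Applying the 90-degree remap twice gives the same result as the 180-degree one, so the four cases stay consistent with each other.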