We need the ability to manually fire a non-Runtime touch event that all relevant display objects are notified about.
I've tried a couple of approaches so far: firing a Runtime touch event and then checking for all objects at that position, and also manually firing non-Runtime touch events on each of those intersecting objects. However, because the events don't propagate properly and lack touch IDs (among other issues), neither solution is working for us.
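For reference, here's a rough sketch of the kind of manual dispatch we've been attempting. The `objectsAtPoint` and `dispatchFakeTouch` helpers and the sample coordinates are just illustrative, not Corona APIs:

```lua
-- Hypothetical helper: find display objects whose content bounds contain (x, y).
local function objectsAtPoint( group, x, y )
    local hits = {}
    for i = 1, group.numChildren do
        local obj = group[i]
        local b = obj.contentBounds
        if b and x >= b.xMin and x <= b.xMax and y >= b.yMin and y <= b.yMax then
            hits[#hits + 1] = obj
        end
    end
    return hits
end

-- Hypothetical helper: dispatch a synthetic touch event to each intersecting object.
local function dispatchFakeTouch( group, x, y, phase )
    for _, obj in ipairs( objectsAtPoint( group, x, y ) ) do
        -- dispatchEvent only notifies this object's own listeners; it doesn't
        -- propagate like a real touch and there's no touch id to supply.
        obj:dispatchEvent( {
            name = "touch",
            phase = phase,          -- "began", "moved", "ended", or "cancelled"
            x = x, y = y,
            xStart = x, yStart = y,
            target = obj,
        } )
    end
end

-- Example: simulate a press and release at (100, 200) on the current stage.
local stage = display.getCurrentStage()
dispatchFakeTouch( stage, 100, 200, "began" )
dispatchFakeTouch( stage, 100, 200, "ended" )
```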
What I want is essentially what Corona must already be doing with the mouse in both the simulator and desktop builds: on press, it fires the appropriate touch events, and they all behave as if they really were touch events.
Is it possible to hook into this? If not, I'll make a feature request.