use RCTEventDispatcher for touch events (6/7)

Summary:
This diff finally makes touch events be emitted to js from the same object as scroll events. Thanks to the changes in previous diffs, this means js will now process them in the right order.

This diff on its own would cause us to send events to js on every frame while dragging a finger on the screen. This is something we tried to avoid with event coalescing in the past, and it gets fixed in the following diff, D2884595.

public
___
**This diff is part of a larger stack. This is a high level overview about what’s going on here.**

This stack of diffs fixes an issue where scroll events and touch events wouldn't be processed in the correct order on the js side.
Current state of the world:
* touch events are sent to js immediately when they happen (this makes js respond to touches as soon as possible)
* scroll events are buffered, coalesced and sent at the beginning of each js frame (coalescing helps in the case where js is busy processing for several native frames and would otherwise get all the scroll events for that period afterwards)
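The buffering and coalescing described above can be modeled with a small sketch. This is an illustrative model, not the actual RCTEventDispatcher API: events carrying the same coalescing key replace the pending one instead of queueing, and the buffer is handed to js once per frame.

```cpp
#include <map>
#include <string>
#include <vector>

// Illustrative model only -- the key format and names are assumptions,
// not the real RCTEventDispatcher interface.
struct Event {
  std::string key;  // e.g. "scroll:<reactTag>" -- hypothetical key format
  int value;        // latest payload wins after coalescing
};

class EventBuffer {
  std::map<std::string, Event> pending_;  // one pending event per key
public:
  // Coalesce: a newer event with the same key replaces the older one.
  void enqueue(const Event &e) { pending_[e.key] = e; }
  // Called at the beginning of each js frame: hand over everything pending.
  std::vector<Event> flush() {
    std::vector<Event> out;
    for (auto &kv : pending_) out.push_back(kv.second);
    pending_.clear();
    return out;
  }
};
```

Under this model, a burst of scroll events produced while js was busy collapses to one event per scroll view when the next frame flushes.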

How did I change this?
1. I’ve made touch events go through the same class that every other event (scroll events included) goes through: RCTEventDispatcher. This gives us a single place to handle all events.
2. I’ve changed RCTEventDispatcher to flush all buffered events every time it gets a touch event, before dispatching the touch. This fixes the original ordering issue.
3. I’ve made “touchMove” behave the same way scroll events do - they are buffered and coalesced. Since “touchMove” events are fired as often as scroll events (while you drag your finger around), doing only step 2 would bring back the issue that buffering was fixing.
All of this together keeps the order of events right and avoids overloading js in the important case we care about. The only downside is increased latency for “touchMove” events, since they are no longer sent to js immediately.
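The three steps above can be sketched as a single ordering rule. This is a hedged model of the behavior described, with assumed names rather than the real RCTEventDispatcher interface: coalescible events (scroll, touchMove) sit in a buffer, and a non-coalescible touch flushes that buffer before being dispatched, so js never sees a touch before the moves and scrolls that preceded it.

```cpp
#include <map>
#include <string>
#include <vector>

// Hypothetical sketch of the ordering rule -- class and method names are
// illustrative assumptions, not the actual RCTEventDispatcher API.
class Dispatcher {
  std::map<std::string, std::string> buffered_;  // event key -> latest payload
  std::vector<std::string> sentToJS_;            // dispatch order js observes
  void flushBuffer() {
    for (auto &kv : buffered_) sentToJS_.push_back(kv.second);
    buffered_.clear();
  }
public:
  // touchMove / scroll: buffered and coalesced, newer replaces older (step 3).
  void sendCoalescible(const std::string &key, const std::string &payload) {
    buffered_[key] = payload;
  }
  // touchStart / touchEnd: flush everything buffered first (step 2),
  // then dispatch the touch itself.
  void sendTouch(const std::string &payload) {
    flushBuffer();
    sentToJS_.push_back(payload);
  }
  void frameTick() { flushBuffer(); }  // per-js-frame flush of leftovers
  const std::vector<std::string> &log() const { return sentToJS_; }
};
```

The payoff is that a “touchEnd” can never jump ahead of the coalesced “touchMove” that preceded it, while a stream of moves still collapses to one event per flush.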

(An even better solution would be changing the native->js event-passing system to be pull-based instead of push-based. That way js would always request the touches that have happened since the last time it asked, which would let it get them as soon as it's able to process them, and native could do the coalescing at that point.
However, that change has a much bigger scope, so I'm going with this stack of diffs for now.)

Reviewed By: nicklockwood

Differential Revision: D2884593

fb-gh-sync-id: 749a4dc6256f93fde0d8733687394b080ddd4484
This commit is contained in:
Martin Kralik 2016-02-03 05:22:17 -08:00 committed by facebook-github-bot-4
parent 4d83cfbc50
commit 4aaa35a22e
1 changed file with 7 additions and 5 deletions


@@ -15,6 +15,7 @@
 #import "RCTBridge.h"
 #import "RCTEventDispatcher.h"
 #import "RCTLog.h"
+#import "RCTTouchEvent.h"
 #import "RCTUIManager.h"
 #import "RCTUtils.h"
 #import "UIView+React.h"
@@ -23,7 +24,7 @@
 // module if we were to assume that modules and RootViews had a 1:1 relationship
 @implementation RCTTouchHandler
 {
-  __weak RCTBridge *_bridge;
+  __weak RCTEventDispatcher *_eventDispatcher;
 /**
  * Arrays managed in parallel tracking native touch object along with the
@@ -46,7 +47,7 @@
   if ((self = [super initWithTarget:self action:@selector(handleGestureUpdate:)])) {
-    _bridge = bridge;
+    _eventDispatcher = [bridge moduleForClass:[RCTEventDispatcher class]];
     _dispatchedInitialTouches = NO;
     _nativeTouches = [NSMutableOrderedSet new];
     _reactTouches = [NSMutableArray new];
@@ -196,9 +197,10 @@ typedef NS_ENUM(NSInteger, RCTTouchEventType) {
     [reactTouches addObject:[touch copy]];
   }
   eventName = RCTNormalizeInputEventName(eventName);
-  [_bridge enqueueJSCall:@"RCTEventEmitter.receiveTouches"
-                    args:@[eventName, reactTouches, changedIndexes]];
+  RCTTouchEvent *event = [[RCTTouchEvent alloc] initWithEventName:eventName
+                                                     reactTouches:reactTouches
+                                                   changedIndexes:changedIndexes];
+  [_eventDispatcher sendEvent:event];
 }
#pragma mark - Gesture Recognizer Delegate Callbacks