* part of #80, support mask input options
* close #188, enhance sampling options
Use a more general sampling strategy interface to describe the
configuration of sampling event collection.
Implemented mousemove, mouse interaction, scroll, and input sampling
strategies.
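The strategy surface might look like the following sketch; the option names mirror the strategies listed above, but treat the exact shape as an assumption rather than the actual rrweb API:

```typescript
// Sketch of a sampling configuration (field names are assumptions
// based on the strategies listed above, not a guaranteed API).
interface SamplingStrategy {
  // false: don't record mouse movement; a number: throttle to that interval (ms)
  mousemove?: boolean | number;
  // disable recording of specific mouse interaction types
  mouseInteraction?: boolean | Record<string, boolean>;
  // throttle scroll events to at most one per N milliseconds
  scroll?: number;
  // 'all': record every input event; 'last': only the final value
  input?: 'all' | 'last';
}

const sampling: SamplingStrategy = {
  mousemove: false,
  mouseInteraction: { MouseUp: false, MouseDown: false },
  scroll: 150,
  input: 'last',
};
```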
- What was broken: replay would play activity from the first page view, but then stop at the second page view (meta event), because actions after that point had been discarded
- This restores the functionality described by the comment 'return the events from last meta to the end.': we never want to discard events that occur after the baseline time
- I believe 'session' is the wrong terminology for this function name, as a session in web analytics usually means a series of page views
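The restored behaviour can be sketched as a pure function; the event types and the Meta type code below are simplified assumptions, not rrweb's actual enums:

```typescript
// Assumed simplified event shape and Meta type code for illustration.
interface RecordedEvent {
  type: number;
  timestamp: number;
}
const META = 4;

// Keep events starting from the last Meta event at or before the
// baseline time through the end of the list, so that nothing occurring
// after the baseline is ever discarded.
function eventsFromLastMeta(
  events: RecordedEvent[],
  baselineTime: number,
): RecordedEvent[] {
  let start = 0;
  for (let i = 0; i < events.length; i++) {
    if (events[i].timestamp > baselineTime) break;
    if (events[i].type === META) start = i;
  }
  return events.slice(start);
}
```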
related to #6
Since the current 'play at any time offset' implementation is fairly simple,
there is plenty of room to optimize its performance.
In this patch, we do the following optimizations:
1. Ignore some of the events during fast forward.
For example, when fast-forwarding 10 minutes ahead,
we do not need to replay mouse movement events from that period.
2. Use a fragment element as a 'virtual parent node'.
Newly added DOM nodes are appended to this fragment node,
and finally appended to the document as a single batch operation.
These changes eliminate much of the time previously spent on reflow/repaint.
I've seen a 10x performance improvement with these approaches.
There are still some things we could do better, but not in this patch:
1. We can build a virtual DOM tree to store DOM mutations.
This would minimize the number of DOM operations.
2. Another thing that may help UX is to make the fast-forward process async and cancellable.
This would make drag-and-drop interactions in the player's UI look smooth.
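Optimization 1 amounts to a filter over the event stream; the type and source codes below are illustrative placeholders, not rrweb's actual enums:

```typescript
// Assumed type codes for illustration only.
const INCREMENTAL_SNAPSHOT = 3;
const MOUSE_MOVE_SOURCE = 1;

interface PlayerEvent {
  type: number;
  source?: number;
  timestamp: number;
}

// During fast forward, mouse movement leaves no lasting DOM state,
// so replaying it before the target time is wasted work.
function eventsToFastForward(
  events: PlayerEvent[],
  targetTime: number,
): PlayerEvent[] {
  return events.filter(
    (e) =>
      e.timestamp <= targetTime &&
      !(e.type === INCREMENTAL_SNAPSHOT && e.source === MOUSE_MOVE_SOURCE),
  );
}
```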
On recordings with many full page loads, DOM state and mutations were being applied only to be discarded when a new page load came in, resulting in very slow rebuild times and an inability to interactively 'scrub' through these recordings.
Following @eoghanmurray's suggestion, we can support three
main scenarios:
1. record only
2. replay only
3. all in one
Since we have implemented the packer feature, which has a big
influence on bundle size, we provide another three bundles:
1. record and pack
2. replay and unpack
3. all in one with pack and unpack
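The six resulting bundles can be described as a build matrix; the entry paths below are hypothetical placeholders, not the repo's actual file layout:

```typescript
// Hypothetical build matrix for the six bundles.
// Entry paths are illustrative, not the project's real files.
const bundles = [
  { name: 'rrweb-record', input: 'src/record/index.ts' },
  { name: 'rrweb-replay', input: 'src/replay/index.ts' },
  { name: 'rrweb', input: 'src/index.ts' }, // all in one
  { name: 'rrweb-record-pack', input: 'src/entries/record-pack.ts' },
  { name: 'rrweb-replay-unpack', input: 'src/entries/replay-unpack.ts' },
  { name: 'rrweb-all-pack', input: 'src/entries/all.ts' }, // all in one + pack/unpack
];
```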
* Move mutation processing into its own object.
This should stand on its own as a refactor, but is intended as a basis
for exposing the new MutationBuffer object to further outside control e.g.
to 'mute' or batch up mutation emission when the page becomes inactive
from a https://developer.mozilla.org/en-US/docs/Web/API/Page_Visibility_API
point of view
* The `processMutations` function needed to be bound to the `mutationBuffer` object, as otherwise `this` referred to the `MutationObserver` object itself
* Neglected to add this output of `npm run typings`
* Get around the binding problem by using Arrow function expressions
* Prettier formatting
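The binding problem and the arrow-function fix can be shown in isolation; the MutationBuffer internals here are a simplified stand-in, not the real implementation:

```typescript
class MutationBuffer {
  private records: string[] = [];

  // As a regular method, `this` would be whatever the caller binds the
  // callback to (e.g. the MutationObserver itself). As an arrow function
  // property, `this` is captured lexically, so the callback stays bound
  // to the MutationBuffer instance even when held detached.
  public processMutations = (mutations: string[]): void => {
    this.records.push(...mutations);
  };

  public size(): number {
    return this.records.length;
  }
}

const buffer = new MutationBuffer();
// A MutationObserver holds the callback detached from the instance:
const callback = buffer.processMutations;
callback(['childList', 'attributes']);
```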
* refactor play, pause, resume, and load style sheet into a subscription style of code
* support live mode in state machine
* 1. upgrade @xstate/fsm
2. add toggle interact methods to the player
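A hand-rolled sketch of such a player state machine; this is not @xstate/fsm's API, and the states and events are illustrative:

```typescript
// Simplified player states including the new live mode.
type PlayerState = 'paused' | 'playing' | 'live';
type PlayerAction = 'PLAY' | 'PAUSE' | 'TO_LIVE';

const transitions: Record<PlayerState, Partial<Record<PlayerAction, PlayerState>>> = {
  paused: { PLAY: 'playing', TO_LIVE: 'live' },
  playing: { PAUSE: 'paused', TO_LIVE: 'live' },
  live: {}, // live mode ignores timeline controls
};

// Return the next state, staying put on undefined transitions.
function transition(state: PlayerState, action: PlayerAction): PlayerState {
  return transitions[state][action] ?? state;
}
```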
There are some long-standing issues in rrweb's mutation observer.
A scenario that causes problems:
A list of DOM nodes: n1, n2, n3, n4, n5
Steps to modify the nodes:
1. remove n1, n2, n3, n4 sequentially
2. append n4, n3, n2, n1 after n5 sequentially
Then we get added-node data like this:
(id: n4, prev: null, next: n3 )
(id: n3, prev: n4, next: n2 )
(id: n2, prev: n3, next: n1 )
(id: n1, prev: n2, next: null)
The problem comes when we try to replay the first added-node datum.
Since its prev node is null, we rely on its next sibling n3. But
n3 is not present at this moment, and in the previous code, we fell back
to appending n4 as the last child of its parent node.
The solution is to defer appending elements whose siblings are
missing. But it is also hard to tell which node is the first one
that needs to be appended.
Taking a step back to rethink the design of the mutation observer,
we found two implementation details that make things complicated:
1. We set the id to -1 when we see nodes that are not serialized yet.
2. We record both the previous sibling and the next sibling to determine the
position of the node.
But we can do better!
First, we can put nodes with un-serialized siblings
into a queue and try to add them again later. Then we can record only the next
sibling as 'the single source of truth', so we can be sure which node is the
last child of its parent.
This patch implements the new observer strategy. Data recorded with
the new observer should no longer contain any node with id -1. But for
compatibility reasons, we still keep some replayer code that handles
legacy data.
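The deferred-append strategy can be sketched with simplified types; the ids, field names, and queue mechanics below are illustrative, not the actual implementation:

```typescript
// Assumed simplified added-node record: only the next sibling is kept
// as 'the single truth'; nextId === null means "last child of parent".
interface AddedNode {
  id: number;
  nextId: number | null;
}

// Apply adds, deferring any node whose next sibling is not attached yet
// and retrying until every node lands. Returns the attachment order.
function applyAdds(adds: AddedNode[]): number[] {
  const attached = new Set<number>();
  const order: number[] = [];
  let queue = [...adds];
  while (queue.length > 0) {
    const deferred: AddedNode[] = [];
    let progressed = false;
    for (const n of queue) {
      if (n.nextId === null || attached.has(n.nextId)) {
        attached.add(n.id);
        order.push(n.id);
        progressed = true;
      } else {
        deferred.push(n); // next sibling missing: retry later
      }
    }
    if (!progressed) throw new Error('unresolvable sibling order');
    queue = deferred;
  }
  return order;
}
```

Running it on the scenario above (n4, n3, n2, n1 appended in that order, each recording its next sibling) attaches n1 first, since it is the only node whose position is already determined.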
* introduce pako and add general packer interface
* add tests for packer
* use function API instead of class API for better tree shaking support
* refactor the rollup bundle config
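A function-style packer sketch: node's built-in zlib stands in for pako here to keep the example self-contained, and the version marker is a made-up detail, not rrweb's actual pack format:

```typescript
import { deflateSync, inflateSync } from 'zlib';

// Function API (tree-shakable): a bundler can drop `unpack` from a
// record-only bundle, which it could not easily do with a class whose
// methods are all reachable from the constructor.
const MARK = 'v1'; // hypothetical version marker prefixed to packed data

function pack(events: unknown[]): Buffer {
  return deflateSync(MARK + JSON.stringify(events));
}

function unpack(raw: Buffer): unknown[] {
  const text = inflateSync(raw).toString();
  if (!text.startsWith(MARK)) {
    throw new Error('unknown pack version');
  }
  return JSON.parse(text.slice(MARK.length));
}
```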