Fixes #161 - adds ~30x faster tokenizer.
author    GoalSmashers <jakub@goalsmashers.com>
          Sun, 17 Nov 2013 09:31:04 +0000 (10:31 +0100)
committer GoalSmashers <jakub@goalsmashers.com>
          Sun, 17 Nov 2013 22:05:51 +0000 (23:05 +0100)
commit    81fa8edf72ba40c11a628a0e0d7279523b855574
tree      a5c85af255d98a4705ad72295bfa91496b841197
parent    17a2d15e75f3bffcd193f6c7d8cfc3632e95c020
Fixes #161 - adds ~30x faster tokenizer.

* Splits data into 128-byte-long chunks (rounded to the nearest closing parenthesis), as sketched below.
* No longer seeks through the whole document on every lookup.
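
A minimal sketch of the chunking idea, in the spirit of lib/selectors/tokenizer.js (the names below are hypothetical, not the module's actual internals): the source is cut into roughly 128-byte slices, each slice extended to the next closing parenthesis so a token never straddles a chunk boundary, and subsequent indexOf scans run against the short chunk rather than the whole document.

    var CHUNK_SIZE = 128;

    // Hypothetical helper: take a ~128-byte slice starting at `offset`,
    // extended to the nearest closing parenthesis past the 128-byte mark.
    function nextChunk(data, offset) {
      var endOfChunk = offset + CHUNK_SIZE;
      if (endOfChunk >= data.length)
        return data.substring(offset);

      var closingParenthesis = data.indexOf(')', endOfChunk);
      return closingParenthesis == -1 ?
        data.substring(offset) :
        data.substring(offset, closingParenthesis + 1);
    }

    // Hypothetical driver: process one short chunk at a time, so
    // character searches never scan the remainder of the document.
    function tokenize(data) {
      var offset = 0;
      while (offset < data.length) {
        var chunk = nextChunk(data, offset);
        // ... scan `chunk` with indexOf for '{', '}', ',', etc. ...
        offset += chunk.length;
      }
    }

The speedup comes from indexOf and similar seeks operating on a string of about 128 bytes instead of the full stylesheet, which is what the second bullet refers to.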
History.md
lib/selectors/tokenizer.js