Standardizes parsed tokens.
author    Jakub Pawlowicz <contact@jakubpawlowicz.com>
          Sun, 11 Dec 2016 10:26:39 +0000 (11:26 +0100)
committer Jakub Pawlowicz <contact@jakubpawlowicz.com>
          Fri, 16 Dec 2016 10:49:35 +0000 (11:49 +0100)
commit    03097a051cfc4f994e8263171a0774e7a21d8063
tree      416fe77e6d1a0d82831b4f32cd846211c6181a96
parent    da45a94c9f62a957c5a9baf940ba2299967b8632
Standardizes parsed tokens.

This commit unifies the structure of tokenized values, i.e.:

```
[
  <TOKEN_NAME>,
  <TOKEN_VALUE>,
  <METADATA>
]
```
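
For illustration, a token following this shape could look like the hypothetical literal below; the concrete token name, value layout, and metadata contents are assumptions for the sake of the example, not values taken from this commit.

```
var token = [
  'rule',            // <TOKEN_NAME>  - kind of token
  ['a{color:red}'],  // <TOKEN_VALUE> - token-specific payload (shape assumed here)
  [1, 0, 'input.css'] // <METADATA>   - e.g. source position and file (assumed layout)
];
```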

Why:

* It is easier to work with those values further down the optimizing
  pipeline and when converting tokens back to text content (see the
  sketch after this list).
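
Minimal sketch, assuming the uniform `[name, value, metadata]` shape above; this is not the library's actual stringifier, only an illustration of why a consistent token layout simplifies code that walks tokens:

```
function summarize(tokens) {
  return tokens.map(function (token) {
    var name = token[0];     // always the token kind
    var value = token[1];    // always the token payload
    var metadata = token[2]; // always the source metadata
    return name + ' (' + value.length + ' value item(s))';
  }).join(', ');
}

// e.g. summarize([['rule', ['a{color:red}'], [1, 0, 'input.css']]])
// -> 'rule (1 value item(s))'
```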
18 files changed:
lib/optimizer/advanced.js
lib/optimizer/basic.js
lib/optimizer/reduce-non-adjacent.js
lib/optimizer/reorderable.js
lib/optimizer/tidy-block.js
lib/optimizer/tidy-rule-duplicates.js
lib/optimizer/tidy-rules.js
lib/stringifier/helpers.js
lib/stringifier/one-time.js
lib/stringifier/simple.js
lib/stringifier/source-maps.js
lib/tokenizer/token.js
lib/tokenizer/tokenize.js
lib/utils/read-sources.js
test/module-test.js
test/optimizer/extract-properties-test.js
test/properties/wrap-for-optimizing-test.js
test/tokenizer/tokenize-test.js