
Searched refs:tokenizer (Results 1 – 7 of 7) sorted by relevance

/dokuwiki/vendor/splitbrain/php-jsstrip/composer.lock
   274: "theseer/tokenizer": "^1.1.3"
   502: "ext-tokenizer": "*",
   529: "description": "Wrapper around PHP's tokenizer extension.",
   532: "tokenizer"
  1374: "name": "theseer/tokenizer",
  1378: "url": "https://github.com/theseer/tokenizer.git",
  1383: …"url": "https://api.github.com/repos/theseer/tokenizer/zipball/34a41e998c2183e22995f158c581e7b5e75…
  1389: "ext-tokenizer": "*",
  1412: "issues": "https://github.com/theseer/tokenizer/issues",
  1413: "source": "https://github.com/theseer/tokenizer/tree/1.2.1"
/dokuwiki/vendor/splitbrain/slika/composer.lock
   282: "theseer/tokenizer": "^1.1.3"
   510: "ext-tokenizer": "*",
   537: "description": "Wrapper around PHP's tokenizer extension.",
   540: "tokenizer"
  1438: "name": "theseer/tokenizer",
  1442: "url": "https://github.com/theseer/tokenizer.git",
  1447: …"url": "https://api.github.com/repos/theseer/tokenizer/zipball/b7489ce515e168639d17feec34b8847c326…
  1453: "ext-tokenizer": "*",
  1476: "issues": "https://github.com/theseer/tokenizer/issues",
  1477: "source": "https://github.com/theseer/tokenizer/tree/1.3.1"
/dokuwiki/inc/indexer.php
   246: return $Indexer->tokenizer($string, $wc);
/dokuwiki/inc/fulltext.php
   954: $words = $Indexer->tokenizer($term_noparen, true);
/dokuwiki/_test/composer.lock
   287: "ext-tokenizer": "*",
   526: "theseer/tokenizer": "^1.2.3"
  2016: "ext-tokenizer": "*",
  2080: "name": "theseer/tokenizer",
  2084: "url": "https://github.com/theseer/tokenizer.git",
  2089: …"url": "https://api.github.com/repos/theseer/tokenizer/zipball/b7489ce515e168639d17feec34b8847c326…
  2095: "ext-tokenizer": "*",
  2118: "issues": "https://github.com/theseer/tokenizer/issues",
  2119: "source": "https://github.com/theseer/tokenizer/tree/1.3.1"
/dokuwiki/inc/Search/Indexer.php
   115: $tokens = $this->tokenizer($text);
   505: public function tokenizer($text, $wc = false)   (function in dokuwiki\Search\Indexer)
/dokuwiki/vendor/geshi/geshi/CHANGELOG
   502: * Improve the performance of the strict mode tokenizer, making highlighting of languages like
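Most of the hits above refer to PHP's built-in tokenizer extension: the composer.lock entries require `ext-tokenizer`, and theseer/tokenizer is described as a "Wrapper around PHP's tokenizer extension." That extension exposes the standard `token_get_all()` function, which splits PHP source into parser tokens. A minimal, self-contained sketch (the sample source string is illustrative, not taken from the DokuWiki tree):

```php
<?php
// token_get_all() returns a mixed array: multi-character tokens come back
// as [token id, token text, line number]; single characters like ';' come
// back as bare strings.
$tokens = token_get_all('<?php echo "hi";');

foreach ($tokens as $token) {
    if (is_array($token)) {
        // token_name() maps the numeric id to its T_* constant name,
        // e.g. T_OPEN_TAG, T_ECHO, T_CONSTANT_ENCAPSED_STRING.
        echo token_name($token[0]), ' => ', $token[1], PHP_EOL;
    } else {
        echo $token, PHP_EOL;
    }
}
```

Note this is distinct from DokuWiki's own `Indexer::tokenizer($text, $wc = false)` seen in the `/dokuwiki/inc/` hits, which splits wiki text into search words for the fulltext index rather than parsing PHP source.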