1 Commit

Author SHA1 Message Date
56c2b4599a fix: HTML formatter was setting bold wrong
2024-09-21 12:56:40 -03:00
46 changed files with 242 additions and 1617 deletions

View File

@@ -9,7 +9,7 @@ permissions:
contents: read
jobs:
build:
runs-on: ubuntu-22.04
runs-on: ubuntu-latest
steps:
- name: Download source
uses: actions/checkout@v4

View File

@@ -7,7 +7,7 @@ permissions:
contents: read
jobs:
build:
runs-on: ubuntu-22.04
runs-on: ubuntu-latest
steps:
- name: Download source
uses: actions/checkout@v4

.gitignore (vendored): 3 changed lines
View File

@@ -12,6 +12,3 @@ venv/
.croupier
coverage/
run_tests
# We use the internal crystal lexer
lexers/crystal.xml

View File

@@ -2,90 +2,6 @@
All notable changes to this project will be documented in this file.
## [0.12.0] - 2025-01-21
### 🚀 Features
- Bumped to latest chroma release
### ⚙️ Miscellaneous Tasks
- Pin ubuntu version in CI
- Mark more mcfunction tests as bad
### Build
- Automate AUR release
## [0.11.1] - 2024-10-14
### 🐛 Bug Fixes
- Support choosing lexers when used as a library
## [0.11.0] - 2024-10-14
### 🚀 Features
- Support selecting only some themes
## [0.10.0] - 2024-09-26
### 🚀 Features
- Optional conditional baking of lexers
### 🐛 Bug Fixes
- Strip binaries for release artifacts
- Fix metadata to show crystal
## [0.9.1] - 2024-09-22
### 🐛 Bug Fixes
- Terminal formatter was skipping things that it could highlight
- Bug in high-level API for png formatter
### 🧪 Testing
- Added minimal tests for svg and png formatters
## [0.9.0] - 2024-09-21
### 🚀 Features
- PNG writer based on Stumpy libs
### ⚙️ Miscellaneous Tasks
- Clean
- Detect version bump in release script
- Improve changelog handling
## [0.8.0] - 2024-09-21
### 🚀 Features
- SVG formatter
### 🐛 Bug Fixes
- HTML formatter was setting bold wrong
### 📚 Documentation
- Added instructions to add as a dependency
### 🧪 Testing
- Add basic tests for crystal and delegating lexers
- Added tests for CSS generation
### ⚙ Miscellaneous Tasks
- Fix example code in README
## [0.7.0] - 2024-09-10
### 🚀 Features

View File

@@ -113,25 +113,3 @@ tasks:
kcov --clean --include-path=./src ${PWD}/coverage ./bin/run_tests
outputs:
- coverage/index.html
loc:
phony: true
always_run: true
dependencies:
- src
commands: |
tokei src -e src/constants/
aur:
phony: true
always_run: true
commands: |
rm -rf aur-{{NAME}}
git clone ssh://aur@aur.archlinux.org/{{NAME}}.git aur-{{NAME}}
sed s/pkgver=.*/pkgver=$(shards version)/ -i aur-{{NAME}}/PKGBUILD
sed s/pkgrel=.*/pkgrel=1/ -i aur-{{NAME}}/PKGBUILD
cd aur-{{NAME}} && updpkgsums && makepkg --printsrcinfo > .SRCINFO
cd aur-{{NAME}} && makepkg -fsr
cd aur-{{NAME}} && git add PKGBUILD .SRCINFO
cd aur-{{NAME}} && git commit -a -m "Update to $(shards version)"
cd aur-{{NAME}} && git push

View File

@@ -82,51 +82,6 @@ puts formatter.format("puts \"Hello, world!\"", lexer)
The reason you may want to use the manual version is to reuse
the lexer and formatter objects for performance reasons.
## Choosing what Lexers you want
By default Tartrazine will support all its lexers by embedding
them in the binary. This makes the binary large. If you are
using it as a library, you may want to just include a selection of lexers. To do that:
* Pass the `-Dnolexers` flag to the compiler
* Set the `TT_LEXERS` environment variable to a
comma-separated list of lexers you want to include.
This builds a binary with only the python, markdown, bash and yaml lexers (enough to highlight this `README.md`):
```bash
> TT_LEXERS=python,markdown,bash,yaml shards build -Dnolexers -d --error-trace
Dependencies are satisfied
Building: tartrazine
```
## Choosing what themes you want
Themes come from two places, tartrazine itself and [Sixteen](https://github.com/ralsina/sixteen).
To only embed selected themes, build your project with the `-Dnothemes` option, and
you can set two environment variables to control which themes are included:
* `TT_THEMES` is a comma-separated list of themes to include from tartrazine (see the styles directory in the source)
* `SIXTEEN_THEMES` is a comma-separated list of themes to include from Sixteen (see the base16 directory in the sixteen source)
For example (using the tartrazine CLI as the project):
```bash
$ TT_THEMES=colorful,autumn SIXTEEN_THEMES=pasque,pico shards build -Dnothemes
Dependencies are satisfied
Building: tartrazine
$ ./bin/tartrazine --list-themes
autumn
colorful
pasque
pico
```
Be careful not to build without any themes at all; if you do, nothing will work.
## Contributing
1. Fork it (<https://github.com/ralsina/tartrazine/fork>)
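
The performance note in the README excerpt above (reusing the lexer and formatter objects across calls) can be sketched roughly as follows. Only `formatter.format(text, lexer)` is taken from the excerpt itself; `Tartrazine.lexer` and `Tartrazine::Html` are assumed entry-point names, not confirmed by this diff.
```crystal
require "tartrazine"

# Rough sketch of reusing the lexer and formatter objects.
# Only formatter.format(text, lexer) appears in the README excerpt above;
# Tartrazine.lexer and Tartrazine::Html are assumed constructor names.
lexer = Tartrazine.lexer("crystal")
formatter = Tartrazine::Html.new

snippets = [
  %(puts "Hello, world!"),
  %(x = [1, 2, 3].map { |i| i * 2 }),
]

# Build the lexer and formatter once, then call format repeatedly,
# instead of reconstructing both objects for every input.
snippets.each do |code|
  puts formatter.format(code, lexer)
end
```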

View File

@@ -7,10 +7,10 @@ docker run --rm --privileged \
# Build for AMD64
docker build . -f Dockerfile.static -t tartrazine-builder
docker run -ti --rm -v "$PWD":/app --user="$UID" tartrazine-builder /bin/sh -c "cd /app && rm -rf lib shard.lock && shards build --static --release && strip bin/tartrazine"
docker run -ti --rm -v "$PWD":/app --user="$UID" tartrazine-builder /bin/sh -c "cd /app && rm -rf lib shard.lock && shards build --static --release"
mv bin/tartrazine bin/tartrazine-static-linux-amd64
# Build for ARM64
docker build . -f Dockerfile.static --platform linux/arm64 -t tartrazine-builder
docker run -ti --rm -v "$PWD":/app --platform linux/arm64 --user="$UID" tartrazine-builder /bin/sh -c "cd /app && rm -rf lib shard.lock && shards build --static --release && strip bin/tartrazine"
docker run -ti --rm -v "$PWD":/app --platform linux/arm64 --user="$UID" tartrazine-builder /bin/sh -c "cd /app && rm -rf lib shard.lock && shards build --static --release"
mv bin/tartrazine bin/tartrazine-static-linux-arm64

View File

@@ -67,6 +67,7 @@ commit_parsers = [
{ message = "^chore\\(deps.*\\)", skip = true },
{ message = "^chore\\(pr\\)", skip = true },
{ message = "^chore\\(pull\\)", skip = true },
{ message = "^chore\\(ignore\\)", skip = true },
{ message = "^chore|^ci", group = "<!-- 7 -->⚙️ Miscellaneous Tasks" },
{ body = ".*security", group = "<!-- 8 -->🛡️ Security" },
{ message = "^revert", group = "<!-- 9 -->◀️ Revert" },

View File

@@ -2,14 +2,14 @@
set e
PKGNAME=$(basename "$PWD")
VERSION=$(git cliff --bumped-version --unreleased |cut -dv -f2)
VERSION=$(git cliff --bumped-version |cut -dv -f2)
sed "s/^version:.*$/version: $VERSION/g" -i shard.yml
git add shard.yml
hace lint test
git cliff --bump -u -p CHANGELOG.md
git cliff --bump -o
git commit -a -m "bump: Release v$VERSION"
hace static
git tag "v$VERSION"
git push --tags
hace static
gh release create "v$VERSION" "bin/$PKGNAME-static-linux-amd64" "bin/$PKGNAME-static-linux-arm64" --title "Release v$VERSION" --notes "$(git cliff -l -s all)"

Binary file not shown.

Binary file not shown.

Binary file not shown.

Binary file not shown.

View File

@@ -1,165 +0,0 @@
<lexer>
<config>
<name>ATL</name>
<alias>atl</alias>
<filename>*.atl</filename>
<mime_type>text/x-atl</mime_type>
<dot_all>true</dot_all>
</config>
<rules>
<state name="root">
<rule pattern="(--.*?)(\n)">
<bygroups>
<token type="CommentSingle" />
<token type="TextWhitespace" />
</bygroups>
</rule>
<rule pattern="(and|distinct|endif|else|for|foreach|if|implies|in|let|not|or|self|super|then|thisModule|xor)\b">
<token type="Keyword" />
</rule>
<rule pattern="(OclUndefined|true|false|#\w+)\b">
<token type="KeywordConstant" />
</rule>
<rule pattern="(module|query|library|create|from|to|uses)\b">
<token type="KeywordNamespace" />
</rule>
<rule pattern="(do)(\s*)({)">
<bygroups>
<token type="KeywordNamespace" />
<token type="TextWhitespace" />
<token type="Punctuation" />
</bygroups>
</rule>
<rule pattern="(abstract|endpoint|entrypoint|lazy|unique)(\s+)">
<bygroups>
<token type="KeywordDeclaration" />
<token type="TextWhitespace" />
</bygroups>
</rule>
<rule pattern="(rule)(\s+)">
<bygroups>
<token type="KeywordNamespace" />
<token type="TextWhitespace" />
</bygroups>
</rule>
<rule pattern="(helper)(\s+)">
<bygroups>
<token type="KeywordNamespace" />
<token type="TextWhitespace" />
</bygroups>
</rule>
<rule pattern="(context)(\s+)">
<bygroups>
<token type="KeywordNamespace" />
<token type="TextWhitespace" />
</bygroups>
</rule>
<rule pattern="(def)(\s*)(:)(\s*)">
<bygroups>
<token type="KeywordNamespace" />
<token type="TextWhitespace" />
<token type="Punctuation" />
<token type="TextWhitespace" />
</bygroups>
</rule>
<rule pattern="(Bag|Boolean|Integer|OrderedSet|Real|Sequence|Set|String|Tuple)">
<token type="KeywordType" />
</rule>
<rule pattern="(\w+)(\s*)(&lt;-|&lt;:=)">
<bygroups>
<token type="NameNamespace" />
<token type="TextWhitespace" />
<token type="Punctuation" />
</bygroups>
</rule>
<rule pattern="#&quot;">
<token type="KeywordConstant" />
<push state="quotedenumliteral" />
</rule>
<rule pattern="&quot;">
<token type="NameNamespace" />
<push state="quotedname" />
</rule>
<rule pattern="[^\S\n]+">
<token type="TextWhitespace" />
</rule>
<rule pattern="&#x27;">
<token type="LiteralString" />
<push state="string" />
</rule>
<rule
pattern="[0-9]*\.[0-9]+">
<token type="LiteralNumberFloat" />
</rule>
<rule pattern="0|[1-9][0-9]*">
<token type="LiteralNumberInteger" />
</rule>
<rule pattern="[*&lt;&gt;+=/-]">
<token type="Operator" />
</rule>
<rule pattern="([{}();:.,!|]|-&gt;)">
<token type="Punctuation" />
</rule>
<rule pattern="\n">
<token type="TextWhitespace" />
</rule>
<rule pattern="\w+">
<token type="NameNamespace" />
</rule>
</state>
<state name="string">
<rule pattern="[^\\&#x27;]+">
<token type="LiteralString" />
</rule>
<rule pattern="\\\\">
<token type="LiteralString" />
</rule>
<rule pattern="\\&#x27;">
<token type="LiteralString" />
</rule>
<rule pattern="\\">
<token type="LiteralString" />
</rule>
<rule pattern="&#x27;">
<token type="LiteralString" />
<pop depth="1" />
</rule>
</state>
<state name="quotedname">
<rule pattern="[^\\&quot;]+">
<token type="NameNamespace" />
</rule>
<rule pattern="\\\\">
<token type="NameNamespace" />
</rule>
<rule pattern="\\&quot;">
<token type="NameNamespace" />
</rule>
<rule pattern="\\">
<token type="NameNamespace" />
</rule>
<rule pattern="&quot;">
<token type="NameNamespace" />
<pop depth="1" />
</rule>
</state>
<state name="quotedenumliteral">
<rule pattern="[^\\&quot;]+">
<token type="KeywordConstant" />
</rule>
<rule pattern="\\\\">
<token type="KeywordConstant" />
</rule>
<rule pattern="\\&quot;">
<token type="KeywordConstant" />
</rule>
<rule pattern="\\">
<token type="KeywordConstant" />
</rule>
<rule pattern="&quot;">
<token type="KeywordConstant" />
<pop depth="1" />
</rule>
</state>
</rules>
</lexer>

View File

@@ -1,120 +0,0 @@
<lexer>
<config>
<name>Beef</name>
<alias>beef</alias>
<filename>*.bf</filename>
<mime_type>text/x-beef</mime_type>
<dot_all>true</dot_all>
<ensure_nl>true</ensure_nl>
</config>
<rules>
<state name="root">
<rule pattern="^\s*\[.*?\]">
<token type="NameAttribute"/>
</rule>
<rule pattern="[^\S\n]+">
<token type="Text"/>
</rule>
<rule pattern="\\\n">
<token type="Text"/>
</rule>
<rule pattern="///[^\n\r]*">
<token type="CommentSpecial"/>
</rule>
<rule pattern="//[^\n\r]*">
<token type="CommentSingle"/>
</rule>
<rule pattern="/[*].*?[*]/">
<token type="CommentMultiline"/>
</rule>
<rule pattern="\n">
<token type="Text"/>
</rule>
<rule pattern="[~!%^&amp;*()+=|\[\]:;,.&lt;&gt;/?-]">
<token type="Punctuation"/>
</rule>
<rule pattern="[{}]">
<token type="Punctuation"/>
</rule>
<rule pattern="@&#34;(&#34;&#34;|[^&#34;])*&#34;">
<token type="LiteralString"/>
</rule>
<rule pattern="\$@?&#34;(&#34;&#34;|[^&#34;])*&#34;">
<token type="LiteralString"/>
</rule>
<rule pattern="&#34;(\\\\|\\&#34;|[^&#34;\n])*[&#34;\n]">
<token type="LiteralString"/>
</rule>
<rule pattern="&#39;\\.&#39;|&#39;[^\\]&#39;">
<token type="LiteralStringChar"/>
</rule>
<rule pattern="0[xX][0-9a-fA-F]+[Ll]?|\d[_\d]*(\.\d*)?([eE][+-]?\d+)?[flFLdD]?">
<token type="LiteralNumber"/>
</rule>
<rule pattern="#[ \t]*(if|endif|else|elif|define|undef|line|error|warning|region|endregion|pragma|nullable)\b">
<token type="CommentPreproc"/>
</rule>
<rule pattern="\b(extern)(\s+)(alias)\b">
<bygroups>
<token type="Keyword"/>
<token type="Text"/>
<token type="Keyword"/>
</bygroups>
</rule>
<rule pattern="(as|await|base|break|by|case|catch|checked|continue|default|delegate|else|event|finally|fixed|for|repeat|goto|if|in|init|is|let|lock|new|scope|on|out|params|readonly|ref|return|sizeof|stackalloc|switch|this|throw|try|typeof|unchecked|virtual|void|while|get|set|new|yield|add|remove|value|alias|ascending|descending|from|group|into|orderby|select|thenby|where|join|equals)\b">
<token type="Keyword"/>
</rule>
<rule pattern="(global)(::)">
<bygroups>
<token type="Keyword"/>
<token type="Punctuation"/>
</bygroups>
</rule>
<rule pattern="(abstract|async|const|enum|explicit|extern|implicit|internal|operator|override|partial|extension|private|protected|public|static|sealed|unsafe|volatile)\b">
<token type="KeywordDeclaration"/>
</rule>
<rule pattern="(bool|byte|char8|char16|char32|decimal|double|float|int|int8|int16|int32|int64|long|object|sbyte|short|string|uint|uint8|uint16|uint32|uint64|uint|let|var)\b\??">
<token type="KeywordType"/>
</rule>
<rule pattern="(true|false|null)\b">
<token type="KeywordConstant"/>
</rule>
<rule pattern="(class|struct|record|interface)(\s+)">
<bygroups>
<token type="Keyword"/>
<token type="Text"/>
</bygroups>
<push state="class"/>
</rule>
<rule pattern="(namespace|using)(\s+)">
<bygroups>
<token type="Keyword"/>
<token type="Text"/>
</bygroups>
<push state="namespace"/>
</rule>
<rule pattern="@?[_a-zA-Z]\w*">
<token type="Name"/>
</rule>
</state>
<state name="class">
<rule pattern="@?[_a-zA-Z]\w*">
<token type="NameClass"/>
<pop depth="1"/>
</rule>
<rule>
<pop depth="1"/>
</rule>
</state>
<state name="namespace">
<rule pattern="(?=\()">
<token type="Text"/>
<pop depth="1"/>
</rule>
<rule pattern="(@?[_a-zA-Z]\w*|\.)+">
<token type="NameNamespace"/>
<pop depth="1"/>
</rule>
</state>
</rules>
</lexer>

View File

@@ -1,53 +0,0 @@
<!--
Lexer for RFC-4180 compliant CSV subject to the following additions:
- UTF-8 encoding is accepted (the RFC requires 7-bit ASCII)
- The line terminator character can be LF or CRLF (the RFC allows CRLF only)
Link to the RFC-4180 specification: https://tools.ietf.org/html/rfc4180
Additions inspired by:
https://github.com/frictionlessdata/datapackage/issues/204#issuecomment-193242077
Future improvements:
- Identify non-quoted numbers as LiteralNumber
- Identify y as an error in "x"y. Currently it's identified as another string
literal.
-->
<lexer>
<config>
<name>CSV</name>
<alias>csv</alias>
<filename>*.csv</filename>
<mime_type>text/csv</mime_type>
</config>
<rules>
<state name="root">
<rule pattern="\r?\n">
<token type="Punctuation" />
</rule>
<rule pattern=",">
<token type="Punctuation" />
</rule>
<rule pattern="&quot;">
<token type="LiteralStringDouble" />
<push state="escaped" />
</rule>
<rule pattern="[^\r\n,]+">
<token type="LiteralString" />
</rule>
</state>
<state name="escaped">
<rule pattern="&quot;&quot;">
<token type="LiteralStringEscape"/>
</rule>
<rule pattern="&quot;">
<token type="LiteralStringDouble" />
<pop depth="1"/>
</rule>
<rule pattern="[^&quot;]+">
<token type="LiteralStringDouble" />
</rule>
</state>
</rules>
</lexer>

View File

@@ -3,6 +3,7 @@
<name>Groff</name>
<alias>groff</alias>
<alias>nroff</alias>
<alias>roff</alias>
<alias>man</alias>
<filename>*.[1-9]</filename>
<filename>*.1p</filename>

View File

@@ -95,22 +95,19 @@
<rule pattern="[:!#$%&amp;*+.\\/&lt;=&gt;?@^|~-]+">
<token type="Operator"/>
</rule>
<rule pattern="\d+_*[eE][+-]?\d+">
<rule pattern="\d+[eE][+-]?\d+">
<token type="LiteralNumberFloat"/>
</rule>
<rule pattern="\d+(_+[\d]+)*\.\d+(_+[\d]+)*([eE][+-]?\d+)?">
<rule pattern="\d+\.\d+([eE][+-]?\d+)?">
<token type="LiteralNumberFloat"/>
</rule>
<rule pattern="0[oO](_*[0-7])+">
<rule pattern="0[oO][0-7]+">
<token type="LiteralNumberOct"/>
</rule>
<rule pattern="0[xX](_*[\da-fA-F])+">
<rule pattern="0[xX][\da-fA-F]+">
<token type="LiteralNumberHex"/>
</rule>
<rule pattern="0[bB](_*[01])+">
<token type="LiteralNumberBin"/>
</rule>
<rule pattern="\d+(_*[\d])*">
<rule pattern="\d+">
<token type="LiteralNumberInteger"/>
</rule>
<rule pattern="&#39;">

View File

@@ -3,7 +3,6 @@
<name>JSON</name>
<alias>json</alias>
<filename>*.json</filename>
<filename>*.jsonc</filename>
<filename>*.avsc</filename>
<mime_type>application/json</mime_type>
<dot_all>true</dot_all>

View File

@@ -1,137 +0,0 @@
<lexer>
<config>
<name>Jsonnet</name>
<alias>jsonnet</alias>
<filename>*.jsonnet</filename>
<filename>*.libsonnet</filename>
</config>
<rules>
<state name="_comments">
<rule pattern="(//|#).*\n"><token type="CommentSingle"/></rule>
<rule pattern="/\*\*([^/]|/(?!\*))*\*/"><token type="LiteralStringDoc"/></rule>
<rule pattern="/\*([^/]|/(?!\*))*\*/"><token type="Comment"/></rule>
</state>
<state name="root">
<rule><include state="_comments"/></rule>
<rule pattern="@&#x27;.*&#x27;"><token type="LiteralString"/></rule>
<rule pattern="@&quot;.*&quot;"><token type="LiteralString"/></rule>
<rule pattern="&#x27;"><token type="LiteralString"/><push state="singlestring"/></rule>
<rule pattern="&quot;"><token type="LiteralString"/><push state="doublestring"/></rule>
<rule pattern="\|\|\|(.|\n)*\|\|\|"><token type="LiteralString"/></rule>
<rule pattern="[+-]?[0-9]+(.[0-9])?"><token type="LiteralNumberFloat"/></rule>
<rule pattern="[!$~+\-&amp;|^=&lt;&gt;*/%]"><token type="Operator"/></rule>
<rule pattern="\{"><token type="Punctuation"/><push state="object"/></rule>
<rule pattern="\["><token type="Punctuation"/><push state="array"/></rule>
<rule pattern="local\b"><token type="Keyword"/><push state="local_name"/></rule>
<rule pattern="assert\b"><token type="Keyword"/><push state="assert"/></rule>
<rule pattern="(assert|else|error|false|for|if|import|importstr|in|null|tailstrict|then|self|super|true)\b"><token type="Keyword"/></rule>
<rule pattern="\s+"><token type="TextWhitespace"/></rule>
<rule pattern="function(?=\()"><token type="Keyword"/><push state="function_params"/></rule>
<rule pattern="std\.[^\W\d]\w*(?=\()"><token type="NameBuiltin"/><push state="function_args"/></rule>
<rule pattern="[^\W\d]\w*(?=\()"><token type="NameFunction"/><push state="function_args"/></rule>
<rule pattern="[^\W\d]\w*"><token type="NameVariable"/></rule>
<rule pattern="[\.()]"><token type="Punctuation"/></rule>
</state>
<state name="singlestring">
<rule pattern="[^&#x27;\\]"><token type="LiteralString"/></rule>
<rule pattern="\\."><token type="LiteralStringEscape"/></rule>
<rule pattern="&#x27;"><token type="LiteralString"/><pop depth="1"/></rule>
</state>
<state name="doublestring">
<rule pattern="[^&quot;\\]"><token type="LiteralString"/></rule>
<rule pattern="\\."><token type="LiteralStringEscape"/></rule>
<rule pattern="&quot;"><token type="LiteralString"/><pop depth="1"/></rule>
</state>
<state name="array">
<rule pattern=","><token type="Punctuation"/></rule>
<rule pattern="\]"><token type="Punctuation"/><pop depth="1"/></rule>
<rule><include state="root"/></rule>
</state>
<state name="local_name">
<rule pattern="[^\W\d]\w*(?=\()"><token type="NameFunction"/><push state="function_params"/></rule>
<rule pattern="[^\W\d]\w*"><token type="NameVariable"/></rule>
<rule pattern="\s+"><token type="TextWhitespace"/></rule>
<rule pattern="(?==)"><token type="TextWhitespace"/><push state="#pop" state="local_value"/></rule>
</state>
<state name="local_value">
<rule pattern="="><token type="Operator"/></rule>
<rule pattern=";"><token type="Punctuation"/><pop depth="1"/></rule>
<rule><include state="root"/></rule>
</state>
<state name="assert">
<rule pattern=":"><token type="Punctuation"/></rule>
<rule pattern=";"><token type="Punctuation"/><pop depth="1"/></rule>
<rule><include state="root"/></rule>
</state>
<state name="function_params">
<rule pattern="[^\W\d]\w*"><token type="NameVariable"/></rule>
<rule pattern="\("><token type="Punctuation"/></rule>
<rule pattern="\)"><token type="Punctuation"/><pop depth="1"/></rule>
<rule pattern=","><token type="Punctuation"/></rule>
<rule pattern="\s+"><token type="TextWhitespace"/></rule>
<rule pattern="="><token type="Operator"/><push state="function_param_default"/></rule>
</state>
<state name="function_args">
<rule pattern="\("><token type="Punctuation"/></rule>
<rule pattern="\)"><token type="Punctuation"/><pop depth="1"/></rule>
<rule pattern=","><token type="Punctuation"/></rule>
<rule pattern="\s+"><token type="TextWhitespace"/></rule>
<rule><include state="root"/></rule>
</state>
<state name="object">
<rule pattern="\s+"><token type="TextWhitespace"/></rule>
<rule pattern="local\b"><token type="Keyword"/><push state="object_local_name"/></rule>
<rule pattern="assert\b"><token type="Keyword"/><push state="object_assert"/></rule>
<rule pattern="\["><token type="Operator"/><push state="field_name_expr"/></rule>
<rule pattern="(?=[^\W\d]\w*)"><token type="Text"/><push state="field_name"/></rule>
<rule pattern="\}"><token type="Punctuation"/><pop depth="1"/></rule>
<rule pattern="&quot;"><token type="NameVariable"/><push state="double_field_name"/></rule>
<rule pattern="&#x27;"><token type="NameVariable"/><push state="single_field_name"/></rule>
<rule><include state="_comments"/></rule>
</state>
<state name="field_name">
<rule pattern="[^\W\d]\w*(?=\()"><token type="NameFunction"/><push state="field_separator" state="function_params"/></rule>
<rule pattern="[^\W\d]\w*"><token type="NameVariable"/><push state="field_separator"/></rule>
</state>
<state name="double_field_name">
<rule pattern="([^&quot;\\]|\\.)*&quot;"><token type="NameVariable"/><push state="field_separator"/></rule>
</state>
<state name="single_field_name">
<rule pattern="([^&#x27;\\]|\\.)*&#x27;"><token type="NameVariable"/><push state="field_separator"/></rule>
</state>
<state name="field_name_expr">
<rule pattern="\]"><token type="Operator"/><push state="field_separator"/></rule>
<rule><include state="root"/></rule>
</state>
<state name="function_param_default">
<rule pattern="(?=[,\)])"><token type="TextWhitespace"/><pop depth="1"/></rule>
<rule><include state="root"/></rule>
</state>
<state name="field_separator">
<rule pattern="\s+"><token type="TextWhitespace"/></rule>
<rule pattern="\+?::?:?"><token type="Punctuation"/><push state="#pop" state="#pop" state="field_value"/></rule>
<rule><include state="_comments"/></rule>
</state>
<state name="field_value">
<rule pattern=","><token type="Punctuation"/><pop depth="1"/></rule>
<rule pattern="\}"><token type="Punctuation"/><pop depth="2"/></rule>
<rule><include state="root"/></rule>
</state>
<state name="object_assert">
<rule pattern=":"><token type="Punctuation"/></rule>
<rule pattern=","><token type="Punctuation"/><pop depth="1"/></rule>
<rule><include state="root"/></rule>
</state>
<state name="object_local_name">
<rule pattern="[^\W\d]\w*"><token type="NameVariable"/><push state="#pop" state="object_local_value"/></rule>
<rule pattern="\s+"><token type="TextWhitespace"/></rule>
</state>
<state name="object_local_value">
<rule pattern="="><token type="Operator"/></rule>
<rule pattern=","><token type="Punctuation"/><pop depth="1"/></rule>
<rule pattern="\}"><token type="Punctuation"/><pop depth="2"/></rule>
<rule><include state="root"/></rule>
</state>
</rules>
</lexer>

View File

@@ -45,7 +45,7 @@
</emitters>
</usingbygroup>
</rule>
<rule pattern="(ACCESS|ADD|ADDRESSES|AGGREGATE|ALIGNED|ALL|ALTER|ANALYSIS|AND|ANY|ARITY|ARN|ARRANGEMENT|ARRAY|AS|ASC|ASSERT|ASSUME|AT|AUCTION|AUTHORITY|AVAILABILITY|AVRO|AWS|BATCH|BEGIN|BETWEEN|BIGINT|BILLED|BODY|BOOLEAN|BOTH|BPCHAR|BROKEN|BROKER|BROKERS|BY|BYTES|CARDINALITY|CASCADE|CASE|CAST|CERTIFICATE|CHAIN|CHAINS|CHAR|CHARACTER|CHARACTERISTICS|CHECK|CLASS|CLIENT|CLOCK|CLOSE|CLUSTER|CLUSTERS|COALESCE|COLLATE|COLUMN|COLUMNS|COMMENT|COMMIT|COMMITTED|COMPACTION|COMPATIBILITY|COMPRESSION|COMPUTE|COMPUTECTL|CONFIG|CONFLUENT|CONNECTION|CONNECTIONS|CONSTRAINT|CONTINUAL|COPY|COUNT|COUNTER|CREATE|CREATECLUSTER|CREATEDB|CREATEROLE|CREATION|CROSS|CSV|CURRENT|CURSOR|DATABASE|DATABASES|DATUMS|DAY|DAYS|DEALLOCATE|DEBEZIUM|DEBUG|DEBUGGING|DEC|DECIMAL|DECLARE|DECODING|DECORRELATED|DEFAULT|DEFAULTS|DELETE|DELIMITED|DELIMITER|DELTA|DESC|DETAILS|DISCARD|DISK|DISTINCT|DOC|DOT|DOUBLE|DROP|EAGER|ELEMENT|ELSE|ENABLE|END|ENDPOINT|ENFORCED|ENVELOPE|ERROR|ERRORS|ESCAPE|ESTIMATE|EVERY|EXCEPT|EXCLUDE|EXECUTE|EXISTS|EXPECTED|EXPLAIN|EXPOSE|EXPRESSIONS|EXTERNAL|EXTRACT|FACTOR|FALSE|FAST|FEATURES|FETCH|FIELDS|FILE|FILTER|FIRST|FIXPOINT|FLOAT|FOLLOWING|FOR|FOREIGN|FORMAT|FORWARD|FROM|FULL|FULLNAME|FUNCTION|FUSION|GENERATOR|GRANT|GREATEST|GROUP|GROUPS|HAVING|HEADER|HEADERS|HISTORY|HOLD|HOST|HOUR|HOURS|HUMANIZED|HYDRATION|ID|IDENTIFIERS|IDS|IF|IGNORE|ILIKE|IMPLEMENTATIONS|IMPORTED|IN|INCLUDE|INDEX|INDEXES|INFO|INHERIT|INLINE|INNER|INPUT|INSERT|INSIGHTS|INSPECT|INT|INTEGER|INTERNAL|INTERSECT|INTERVAL|INTO|INTROSPECTION|IS|ISNULL|ISOLATION|JOIN|JOINS|JSON|KAFKA|KEY|KEYS|LAST|LATERAL|LATEST|LEADING|LEAST|LEFT|LEGACY|LETREC|LEVEL|LIKE|LIMIT|LINEAR|LIST|LOAD|LOCAL|LOCALLY|LOG|LOGICAL|LOGIN|LOWERING|MANAGED|MANUAL|MAP|MARKETING|MATERIALIZE|MATERIALIZED|MAX|MECHANISMS|MEMBERSHIP|MESSAGE|METADATA|MINUTE|MINUTES|MODE|MONTH|MONTHS|MUTUALLY|MYSQL|NAME|NAMES|NATURAL|NEGATIVE|NEW|NEXT|NO|NOCREATECLUSTER|NOCREATEDB|NOCREATEROLE|NODE|NOINHERIT|NOLOGIN|NON|NONE|NOSUPERUSER|NOT|NOTICE|NOTICES|NULL|NULLIF|NULLS|OBJECTS|OF|OFFSET|ON|ONLY|OPERATOR|OPTIMIZED|OPTIMIZER|OPTIONS|OR|ORDER|ORDINALITY|OUTER|OVER|OWNED|OWNER|PARTITION|PARTITIONS|PASSWORD|PATH|PHYSICAL|PLAN|PLANS|PORT|POSITION|POSTGRES|PRECEDING|PRECISION|PREFIX|PREPARE|PRIMARY|PRIVATELINK|PRIVILEGES|PROGRESS|PROTOBUF|PROTOCOL|PUBLIC|PUBLICATION|PUSHDOWN|QUERY|QUOTE|RAISE|RANGE|RATE|RAW|READ|READY|REAL|REASSIGN|RECURSION|RECURSIVE|REDACTED|REDUCE|REFERENCE|REFERENCES|REFRESH|REGEX|REGION|REGISTRY|RENAME|REOPTIMIZE|REPEATABLE|REPLACE|REPLAN|REPLICA|REPLICAS|REPLICATION|RESET|RESPECT|RESTRICT|RETAIN|RETURN|RETURNING|REVOKE|RIGHT|ROLE|ROLES|ROLLBACK|ROTATE|ROUNDS|ROW|ROWS|SASL|SCALE|SCHEDULE|SCHEMA|SCHEMAS|SECOND|SECONDS|SECRET|SECRETS|SECURITY|SEED|SELECT|SEQUENCES|SERIALIZABLE|SERVICE|SESSION|SET|SHARD|SHOW|SINK|SINKS|SIZE|SMALLINT|SNAPSHOT|SOME|SOURCE|SOURCES|SSH|SSL|START|STDIN|STDOUT|STORAGE|STORAGECTL|STRATEGY|STRICT|STRING|STRONG|SUBSCRIBE|SUBSOURCE|SUBSOURCES|SUBSTRING|SUBTREE|SUPERUSER|SWAP|SYNTAX|SYSTEM|TABLE|TABLES|TAIL|TASK|TEMP|TEMPORARY|TEXT|THEN|TICK|TIES|TIME|TIMELINE|TIMEOUT|TIMESTAMP|TIMESTAMPTZ|TIMING|TO|TOKEN|TOPIC|TPCH|TRACE|TRAILING|TRANSACTION|TRANSACTIONAL|TRIM|TRUE|TUNNEL|TYPE|TYPES|UNBOUNDED|UNCOMMITTED|UNION|UNIQUE|UNKNOWN|UNNEST|UNTIL|UP|UPDATE|UPSERT|URL|USAGE|USER|USERNAME|USERS|USING|VALIDATE|VALUE|VALUES|VARCHAR|VARIADIC|VARYING|VERSION|VIEW|VIEWS|WAIT|WARNING|WEBHOOK|WHEN|WHERE|WINDOW|WIRE|WITH|WITHIN|WITHOUT|WORK|WORKERS|WORKLOAD|WRITE|YEAR|YEARS|YUGABYTE|ZONE|ZONES)\b">
<rule pattern="(ACCESS|ADD|ADDRESSES|AGGREGATE|ALIGNED|ALL|ALTER|ANALYSIS|AND|ANY|ARITY|ARN|ARRANGEMENT|ARRAY|AS|ASC|ASSERT|ASSUME|AT|AUCTION|AUTHORITY|AVAILABILITY|AVRO|AWS|BATCH|BEGIN|BETWEEN|BIGINT|BILLED|BODY|BOOLEAN|BOTH|BPCHAR|BROKEN|BROKER|BROKERS|BY|BYTES|CARDINALITY|CASCADE|CASE|CAST|CERTIFICATE|CHAIN|CHAINS|CHAR|CHARACTER|CHARACTERISTICS|CHECK|CLIENT|CLOSE|CLUSTER|CLUSTERS|COALESCE|COLLATE|COLUMN|COLUMNS|COMMENT|COMMIT|COMMITTED|COMPACTION|COMPATIBILITY|COMPRESSION|COMPUTE|COMPUTECTL|CONFIG|CONFLUENT|CONNECTION|CONNECTIONS|CONSTRAINT|COPY|COUNT|COUNTER|CREATE|CREATECLUSTER|CREATEDB|CREATEROLE|CREATION|CROSS|CSV|CURRENT|CURSOR|DATABASE|DATABASES|DATUMS|DAY|DAYS|DEALLOCATE|DEBEZIUM|DEBUG|DEBUGGING|DEC|DECIMAL|DECLARE|DECODING|DECORRELATED|DEFAULT|DEFAULTS|DELETE|DELIMITED|DELIMITER|DELTA|DESC|DETAILS|DISCARD|DISK|DISTINCT|DOC|DOT|DOUBLE|DROP|EAGER|ELEMENT|ELSE|ENABLE|END|ENDPOINT|ENFORCED|ENVELOPE|ERROR|ERRORS|ESCAPE|ESTIMATE|EVERY|EXCEPT|EXECUTE|EXISTS|EXPECTED|EXPLAIN|EXPOSE|EXPRESSIONS|EXTERNAL|EXTRACT|FACTOR|FALSE|FAST|FEATURES|FETCH|FIELDS|FILE|FILTER|FIRST|FIXPOINT|FLOAT|FOLLOWING|FOR|FOREIGN|FORMAT|FORWARD|FROM|FULL|FULLNAME|FUNCTION|GENERATOR|GRANT|GREATEST|GROUP|GROUPS|HAVING|HEADER|HEADERS|HISTORY|HOLD|HOST|HOUR|HOURS|HUMANIZED|ID|IDENTIFIERS|IDS|IF|IGNORE|ILIKE|IMPLEMENTATIONS|IMPORTED|IN|INCLUDE|INDEX|INDEXES|INFO|INHERIT|INLINE|INNER|INPUT|INSERT|INSIGHTS|INSPECT|INT|INTEGER|INTERNAL|INTERSECT|INTERVAL|INTO|INTROSPECTION|IS|ISNULL|ISOLATION|JOIN|JOINS|JSON|KAFKA|KEY|KEYS|LAST|LATERAL|LATEST|LEADING|LEAST|LEFT|LEGACY|LETREC|LEVEL|LIKE|LIMIT|LINEAR|LIST|LOAD|LOCAL|LOCALLY|LOG|LOGICAL|LOGIN|LOWERING|MANAGED|MANUAL|MAP|MARKETING|MATERIALIZE|MATERIALIZED|MAX|MECHANISMS|MEMBERSHIP|MESSAGE|METADATA|MINUTE|MINUTES|MODE|MONTH|MONTHS|MUTUALLY|MYSQL|NAME|NAMES|NATURAL|NEGATIVE|NEW|NEXT|NO|NOCREATECLUSTER|NOCREATEDB|NOCREATEROLE|NODE|NOINHERIT|NOLOGIN|NON|NONE|NOSUPERUSER|NOT|NOTICE|NOTICES|NULL|NULLIF|NULLS|OBJECTS|OF|OFFSET|ON|ONLY|OPERATOR|OPTIMIZED|OPTIMIZER|OPTIONS|OR|ORDER|ORDINALITY|OUTER|OVER|OWNED|OWNER|PARTITION|PARTITIONS|PASSWORD|PATH|PHYSICAL|PLAN|PLANS|PORT|POSITION|POSTGRES|PRECEDING|PRECISION|PREFIX|PREPARE|PRIMARY|PRIVATELINK|PRIVILEGES|PROGRESS|PROTOBUF|PROTOCOL|PUBLICATION|PUSHDOWN|QUERY|QUOTE|RAISE|RANGE|RATE|RAW|READ|REAL|REASSIGN|RECURSION|RECURSIVE|REDACTED|REFERENCE|REFERENCES|REFRESH|REGEX|REGION|REGISTRY|REHYDRATION|RENAME|REOPTIMIZE|REPEATABLE|REPLACE|REPLAN|REPLICA|REPLICAS|REPLICATION|RESET|RESPECT|RESTRICT|RETAIN|RETURN|RETURNING|REVOKE|RIGHT|ROLE|ROLES|ROLLBACK|ROTATE|ROUNDS|ROW|ROWS|SASL|SCALE|SCHEDULE|SCHEMA|SCHEMAS|SECOND|SECONDS|SECRET|SECRETS|SECURITY|SEED|SELECT|SEQUENCES|SERIALIZABLE|SERVICE|SESSION|SET|SHARD|SHOW|SINK|SINKS|SIZE|SMALLINT|SNAPSHOT|SOME|SOURCE|SOURCES|SSH|SSL|START|STDIN|STDOUT|STORAGE|STORAGECTL|STRATEGY|STRICT|STRING|STRONG|SUBSCRIBE|SUBSOURCE|SUBSOURCES|SUBSTRING|SUBTREE|SUPERUSER|SWAP|SYNTAX|SYSTEM|TABLE|TABLES|TAIL|TEMP|TEMPORARY|TEXT|THEN|TICK|TIES|TIME|TIMELINE|TIMEOUT|TIMESTAMP|TIMESTAMPTZ|TIMING|TO|TOKEN|TOPIC|TPCH|TRACE|TRAILING|TRANSACTION|TRANSACTIONAL|TRIM|TRUE|TUNNEL|TYPE|TYPES|UNBOUNDED|UNCOMMITTED|UNION|UNIQUE|UNKNOWN|UP|UPDATE|UPSERT|URL|USAGE|USER|USERNAME|USERS|USING|VALIDATE|VALUE|VALUES|VARCHAR|VARIADIC|VARYING|VERSION|VIEW|VIEWS|WARNING|WEBHOOK|WHEN|WHERE|WINDOW|WIRE|WITH|WITHIN|WITHOUT|WORK|WORKERS|WRITE|YEAR|YEARS|ZONE|ZONES)\b">
<token type="Keyword" />
</rule>
<rule pattern="[+*/&lt;&gt;=~!@#%^&amp;|`?-]+">

View File

@@ -1,137 +1,182 @@
<lexer>
<config>
<name>MCFunction</name>
<name>mcfunction</name>
<alias>mcfunction</alias>
<alias>mcf</alias>
<filename>*.mcfunction</filename>
<mime_type>text/mcfunction</mime_type>
<dot_all>true</dot_all>
<not_multiline>true</not_multiline>
</config>
<rules>
<state name="nbtobjectvalue">
<rule pattern="(&#34;(\\\\|\\&#34;|[^&#34;])*&#34;|[a-zA-Z0-9_]+)">
<token type="NameTag"/>
<push state="nbtobjectattribute"/>
</rule>
<rule pattern="\}">
<token type="Punctuation"/>
<pop depth="1"/>
</rule>
</state>
<state name="nbtarrayvalue">
<rule>
<include state="nbtvalue"/>
</rule>
<rule pattern=",">
<token type="Punctuation"/>
</rule>
<rule pattern="\]">
<token type="Punctuation"/>
<pop depth="1"/>
</rule>
</state>
<state name="nbtvalue">
<rule>
<include state="simplevalue"/>
</rule>
<rule pattern="\{">
<token type="Punctuation"/>
<push state="nbtobjectvalue"/>
</rule>
<rule pattern="\[">
<token type="Punctuation"/>
<push state="nbtarrayvalue"/>
</rule>
</state>
<state name="argumentvalue">
<rule>
<include state="simplevalue"/>
</rule>
<rule pattern=",">
<token type="Punctuation"/>
<pop depth="1"/>
</rule>
<rule pattern="[}\]]">
<token type="Punctuation"/>
<pop depth="2"/>
</rule>
</state>
<state name="argumentlist">
<rule pattern="(nbt)(={)">
<bygroups>
<token type="NameAttribute"/>
<token type="Punctuation"/>
</bygroups>
<push state="nbtobjectvalue"/>
</rule>
<rule pattern="([A-Za-z0-9/_!]+)(={)">
<bygroups>
<token type="NameAttribute"/>
<token type="Punctuation"/>
</bygroups>
<push state="argumentlist"/>
</rule>
<rule pattern="([A-Za-z0-9/_!]+)(=)">
<bygroups>
<token type="NameAttribute"/>
<token type="Punctuation"/>
</bygroups>
<push state="argumentvalue"/>
</rule>
<rule>
<include state="simplevalue"/>
</rule>
<rule pattern=",">
<token type="Punctuation"/>
</rule>
<rule pattern="[}\]]">
<token type="Punctuation"/>
<pop depth="1"/>
</rule>
</state>
<state name="root">
<rule><include state="names"/></rule>
<rule><include state="comments"/></rule>
<rule><include state="literals"/></rule>
<rule><include state="whitespace"/></rule>
<rule><include state="property"/></rule>
<rule><include state="operators"/></rule>
<rule><include state="selectors"/></rule>
<rule pattern="#.*?\n">
<token type="CommentSingle"/>
</rule>
<rule pattern="/?(geteduclientinfo|clearspawnpoint|defaultgamemode|transferserver|toggledownfall|immutableworld|detectredstone|setidletimeout|playanimation|classroommode|spreadplayers|testforblocks|setmaxplayers|setworldspawn|testforblock|worldbuilder|createagent|worldborder|camerashake|advancement|raytracefog|locatebiome|tickingarea|replaceitem|attributes|spawnpoint|difficulty|experience|scoreboard|whitelist|structure|playsound|stopsound|forceload|spectate|gamerule|function|schedule|wsserver|teleport|position|save-off|particle|setblock|datapack|mobevent|transfer|gamemode|save-all|bossbar|enchant|trigger|collect|execute|weather|teammsg|tpagent|banlist|dropall|publish|tellraw|testfor|save-on|destroy|ability|locate|summon|remove|effect|reload|ban-ip|recipe|pardon|detect|music|clear|clone|event|mixer|debug|title|ride|stop|list|turn|data|team|kick|loot|tell|help|give|flog|fill|move|time|seed|kill|save|item|deop|code|tag|ban|msg|say|tp|me|op|xp|w|place)\b">
<token type="KeywordReserved"/>
</rule>
<rule pattern="(@p|@r|@a|@e|@s|@c|@v)">
<token type="KeywordConstant"/>
</rule>
<rule pattern="\[">
<token type="Punctuation"/>
<push state="argumentlist"/>
</rule>
<rule pattern="{">
<token type="Punctuation"/>
<push state="nbtobjectvalue"/>
</rule>
<rule pattern="~">
<token type="NameBuiltin"/>
</rule>
<rule pattern="([a-zA-Z_]+:)?[a-zA-Z_]+\b">
<token type="Text"/>
</rule>
<rule pattern="([a-z]+)(\.)([0-9]+)\b">
<bygroups>
<token type="Text"/>
<token type="Punctuation"/>
<token type="LiteralNumber"/>
</bygroups>
</rule>
<rule pattern="([&lt;&gt;=]|&lt;=|&gt;=)">
<token type="Punctuation"/>
</rule>
<rule>
<include state="simplevalue"/>
</rule>
<rule pattern="\s+">
<token type="TextWhitespace"/>
</rule>
</state>
<state name="names">
<rule pattern="^(\s*)([a-z_]+)"><bygroups><token type="TextWhitespace"/><token type="NameBuiltin"/></bygroups></rule>
<rule pattern="(?&lt;=run)\s+[a-z_]+"><token type="NameBuiltin"/></rule>
<rule pattern="\b[0-9a-fA-F]+(?:-[0-9a-fA-F]+){4}\b"><token type="NameVariable"/></rule>
<rule><include state="resource-name"/></rule>
<rule pattern="[A-Za-z_][\w.#%$]+"><token type="KeywordConstant"/></rule>
<rule pattern="[#%$][\w.#%$]+"><token type="NameVariableMagic"/></rule>
<state name="simplevalue">
<rule pattern="(true|false)">
<token type="KeywordConstant"/>
</rule>
<rule pattern="[01]b">
<token type="LiteralNumber"/>
</rule>
<rule pattern="-?(0|[1-9]\d*)(\.\d+[eE](\+|-)?\d+|[eE](\+|-)?\d+|\.\d+)">
<token type="LiteralNumberFloat"/>
</rule>
<rule pattern="(-?\d+)(\.\.)(-?\d+)">
<bygroups>
<token type="LiteralNumberInteger"/>
<token type="Punctuation"/>
<token type="LiteralNumberInteger"/>
</bygroups>
</rule>
<rule pattern="-?(0|[1-9]\d*)">
<token type="LiteralNumberInteger"/>
</rule>
<rule pattern="&#34;(\\\\|\\&#34;|[^&#34;])*&#34;">
<token type="LiteralStringDouble"/>
</rule>
<rule pattern="&#39;[^&#39;]+&#39;">
<token type="LiteralStringSingle"/>
</rule>
<rule pattern="([!#]?)(\w+)">
<bygroups>
<token type="Punctuation"/>
<token type="Text"/>
</bygroups>
</rule>
</state>
<state name="resource-name">
<rule pattern="#?[a-z_][a-z_.-]*:[a-z0-9_./-]+"><token type="NameFunction"/></rule>
<rule pattern="#?[a-z0-9_\.\-]+\/[a-z0-9_\.\-\/]+"><token type="NameFunction"/></rule>
</state>
<state name="whitespace">
<rule pattern="\s+"><token type="TextWhitespace"/></rule>
</state>
<state name="comments">
<rule pattern="^\s*(#[&gt;!])"><token type="CommentMultiline"/><push state="comments.block" state="comments.block.emphasized"/></rule>
<rule pattern="#.*$"><token type="CommentSingle"/></rule>
</state>
<state name="comments.block">
<rule pattern="^\s*#[&gt;!]"><token type="CommentMultiline"/><push state="comments.block.emphasized"/></rule>
<rule pattern="^\s*#"><token type="CommentMultiline"/><push state="comments.block.normal"/></rule>
<rule><pop depth="1"/></rule>
</state>
<state name="comments.block.normal">
<rule><include state="comments.block.special"/></rule>
<rule pattern="\S+"><token type="CommentMultiline"/></rule>
<rule pattern="\n"><token type="Text"/><pop depth="1"/></rule>
<rule><include state="whitespace"/></rule>
</state>
<state name="comments.block.emphasized">
<rule><include state="comments.block.special"/></rule>
<rule pattern="\S+"><token type="LiteralStringDoc"/></rule>
<rule pattern="\n"><token type="Text"/><pop depth="1"/></rule>
<rule><include state="whitespace"/></rule>
</state>
<state name="comments.block.special">
<rule pattern="@\S+"><token type="NameDecorator"/></rule>
<rule><include state="resource-name"/></rule>
<rule pattern="[#%$][\w.#%$]+"><token type="NameVariableMagic"/></rule>
</state>
<state name="operators">
<rule pattern="[\-~%^?!+*&lt;&gt;\\/|&amp;=.]"><token type="Operator"/></rule>
</state>
<state name="literals">
<rule pattern="\.\."><token type="Literal"/></rule>
<rule pattern="(true|false)"><token type="KeywordPseudo"/></rule>
<rule pattern="[A-Za-z_]+"><token type="NameVariableClass"/></rule>
<rule pattern="[0-7]b"><token type="LiteralNumberByte"/></rule>
<rule pattern="[+-]?\d*\.?\d+([eE]?[+-]?\d+)?[df]?\b"><token type="LiteralNumberFloat"/></rule>
<rule pattern="[+-]?\d+\b"><token type="LiteralNumberInteger"/></rule>
<rule pattern="&quot;"><token type="LiteralStringDouble"/><push state="literals.string-double"/></rule>
<rule pattern="&#x27;"><token type="LiteralStringSingle"/><push state="literals.string-single"/></rule>
</state>
<state name="literals.string-double">
<rule pattern="\\."><token type="LiteralStringEscape"/></rule>
<rule pattern="[^\\&quot;\n]+"><token type="LiteralStringDouble"/></rule>
<rule pattern="&quot;"><token type="LiteralStringDouble"/><pop depth="1"/></rule>
</state>
<state name="literals.string-single">
<rule pattern="\\."><token type="LiteralStringEscape"/></rule>
<rule pattern="[^\\&#x27;\n]+"><token type="LiteralStringSingle"/></rule>
<rule pattern="&#x27;"><token type="LiteralStringSingle"/><pop depth="1"/></rule>
</state>
<state name="selectors">
<rule pattern="@[a-z]"><token type="NameVariable"/></rule>
</state>
<state name="property">
<rule pattern="\{"><token type="Punctuation"/><push state="property.curly" state="property.key"/></rule>
<rule pattern="\["><token type="Punctuation"/><push state="property.square" state="property.key"/></rule>
</state>
<state name="property.curly">
<rule><include state="whitespace"/></rule>
<rule><include state="property"/></rule>
<rule pattern="\}"><token type="Punctuation"/><pop depth="1"/></rule>
</state>
<state name="property.square">
<rule><include state="whitespace"/></rule>
<rule><include state="property"/></rule>
<rule pattern="\]"><token type="Punctuation"/><pop depth="1"/></rule>
<rule pattern=","><token type="Punctuation"/></rule>
</state>
<state name="property.key">
<rule><include state="whitespace"/></rule>
<rule pattern="#?[a-z_][a-z_\.\-]*\:[a-z0-9_\.\-/]+(?=\s*\=)"><token type="NameAttribute"/><push state="property.delimiter"/></rule>
<rule pattern="#?[a-z_][a-z0-9_\.\-/]+"><token type="NameAttribute"/><push state="property.delimiter"/></rule>
<rule pattern="[A-Za-z_\-\+]+"><token type="NameAttribute"/><push state="property.delimiter"/></rule>
<rule pattern="&quot;"><token type="NameAttribute"/><push state="property.delimiter"/></rule>
<rule pattern="&#x27;"><token type="NameAttribute"/><push state="property.delimiter"/></rule>
<rule pattern="-?\d+"><token type="LiteralNumberInteger"/><push state="property.delimiter"/></rule>
<rule><pop depth="1"/></rule>
</state>
<state name="property.key.string-double">
<rule pattern="\\."><token type="LiteralStringEscape"/></rule>
<rule pattern="[^\\&quot;\n]+"><token type="NameAttribute"/></rule>
<rule pattern="&quot;"><token type="NameAttribute"/><pop depth="1"/></rule>
</state>
<state name="property.key.string-single">
<rule pattern="\\."><token type="LiteralStringEscape"/></rule>
<rule pattern="[^\\&#x27;\n]+"><token type="NameAttribute"/></rule>
<rule pattern="&#x27;"><token type="NameAttribute"/><pop depth="1"/></rule>
</state>
<state name="property.delimiter">
<rule><include state="whitespace"/></rule>
<rule pattern="[:=]!?"><token type="Punctuation"/><push state="property.value"/></rule>
<rule pattern=","><token type="Punctuation"/></rule>
<rule><pop depth="1"/></rule>
</state>
<state name="property.value">
<rule><include state="whitespace"/></rule>
<rule pattern="#?[a-z_][a-z_\.\-]*\:[a-z0-9_\.\-/]+"><token type="NameTag"/></rule>
<rule pattern="#?[a-z_][a-z0-9_\.\-/]+"><token type="NameTag"/></rule>
<rule><include state="literals"/></rule>
<rule><include state="property"/></rule>
<rule><pop depth="1"/></rule>
<state name="nbtobjectattribute">
<rule>
<include state="nbtvalue"/>
</rule>
<rule pattern=":">
<token type="Punctuation"/>
</rule>
<rule pattern=",">
<token type="Punctuation"/>
<pop depth="1"/>
</rule>
<rule pattern="\}">
<token type="Punctuation"/>
<pop depth="2"/>
</rule>
</state>
</rules>
</lexer>

View File

@@ -106,7 +106,7 @@
</bygroups>
<push state="interpol"/>
</rule>
<rule pattern="(&amp;&amp;|&gt;=|&lt;=|\+\+|-&gt;|!=|=|\|\||//|==|@|!|\+|\?|&lt;|\.|&gt;|\*)">
<rule pattern="(&amp;&amp;|&gt;=|&lt;=|\+\+|-&gt;|!=|\|\||//|==|@|!|\+|\?|&lt;|\.|&gt;|\*)">
<token type="Operator"/>
</rule>
<rule pattern="[;:]">

View File

@@ -1,59 +0,0 @@
<lexer>
<config>
<name>NSIS</name>
<alias>nsis</alias>
<alias>nsi</alias>
<alias>nsh</alias>
<filename>*.nsi</filename>
<filename>*.nsh</filename>
<mime_type>text/x-nsis</mime_type>
<case_insensitive>true</case_insensitive>
<not_multiline>true</not_multiline>
</config>
<rules>
<state name="root">
<rule pattern="([;#].*)(\n)"><bygroups><token type="Comment"/><token type="TextWhitespace"/></bygroups></rule>
<rule pattern="&#x27;.*?&#x27;"><token type="LiteralStringSingle"/></rule>
<rule pattern="&quot;"><token type="LiteralStringDouble"/><push state="str_double"/></rule>
<rule pattern="`"><token type="LiteralStringBacktick"/><push state="str_backtick"/></rule>
<rule><include state="macro"/></rule>
<rule><include state="interpol"/></rule>
<rule><include state="basic"/></rule>
<rule pattern="\$\{[a-z_|][\w|]*\}"><token type="KeywordPseudo"/></rule>
<rule pattern="/[a-z_]\w*"><token type="NameAttribute"/></rule>
<rule pattern="\s+"><token type="TextWhitespace"/></rule>
<rule pattern="[\w.]+"><token type="Text"/></rule>
</state>
<state name="basic">
<rule pattern="(\n)(Function)(\s+)([._a-z][.\w]*)\b"><bygroups><token type="TextWhitespace"/><token type="Keyword"/><token type="TextWhitespace"/><token type="NameFunction"/></bygroups></rule>
<rule pattern="\b([_a-z]\w*)(::)([a-z][a-z0-9]*)\b"><bygroups><token type="KeywordNamespace"/><token type="Punctuation"/><token type="NameFunction"/></bygroups></rule>
<rule pattern="\b([_a-z]\w*)(:)"><bygroups><token type="NameLabel"/><token type="Punctuation"/></bygroups></rule>
<rule pattern="(\b[ULS]|\B)([!&lt;&gt;=]?=|\&lt;\&gt;?|\&gt;)\B"><token type="Operator"/></rule>
<rule pattern="[|+-]"><token type="Operator"/></rule>
<rule pattern="\\"><token type="Punctuation"/></rule>
<rule pattern="\b(Abort|Add(?:BrandingImage|Size)|Allow(?:RootDirInstall|SkipFiles)|AutoCloseWindow|BG(?:Font|Gradient)|BrandingText|BringToFront|Call(?:InstDLL)?|(?:Sub)?Caption|ChangeUI|CheckBitmap|ClearErrors|CompletedText|ComponentText|CopyFiles|CRCCheck|Create(?:Directory|Font|Shortcut)|Delete(?:INI(?:Sec|Str)|Reg(?:Key|Value))?|DetailPrint|DetailsButtonText|Dir(?:Show|Text|Var|Verify)|(?:Disabled|Enabled)Bitmap|EnableWindow|EnumReg(?:Key|Value)|Exch|Exec(?:Shell|Wait)?|ExpandEnvStrings|File(?:BufSize|Close|ErrorText|Open|Read(?:Byte)?|Seek|Write(?:Byte)?)?|Find(?:Close|First|Next|Window)|FlushINI|Function(?:End)?|Get(?:CurInstType|CurrentAddress|DlgItem|DLLVersion(?:Local)?|ErrorLevel|FileTime(?:Local)?|FullPathName|FunctionAddress|InstDirError|LabelAddress|TempFileName)|Goto|HideWindow|Icon|If(?:Abort|Errors|FileExists|RebootFlag|Silent)|InitPluginsDir|Install(?:ButtonText|Colors|Dir(?:RegKey)?)|Inst(?:ProgressFlags|Type(?:[GS]etText)?)|Int(?:CmpU?|Fmt|Op)|IsWindow|LangString(?:UP)?|License(?:BkColor|Data|ForceSelection|LangString|Text)|LoadLanguageFile|LockWindow|Log(?:Set|Text)|MessageBox|MiscButtonText|Name|Nop|OutFile|(?:Uninst)?Page(?:Ex(?:End)?)?|PluginDir|Pop|Push|Quit|Read(?:(?:Env|INI|Reg)Str|RegDWORD)|Reboot|(?:Un)?RegDLL|Rename|RequestExecutionLevel|ReserveFile|Return|RMDir|SearchPath|Section(?:Divider|End|(?:(?:Get|Set)(?:Flags|InstTypes|Size|Text))|Group(?:End)?|In)?|SendMessage|Set(?:AutoClose|BrandingImage|Compress(?:ionLevel|or(?:DictSize)?)?|CtlColors|CurInstType|DatablockOptimize|DateSave|Details(?:Print|View)|Error(?:s|Level)|FileAttributes|Font|OutPath|Overwrite|PluginUnload|RebootFlag|ShellVarContext|Silent|StaticBkColor)|Show(?:(?:I|Uni)nstDetails|Window)|Silent(?:Un)?Install|Sleep|SpaceTexts|Str(?:CmpS?|Cpy|Len)|SubSection(?:End)?|Uninstall(?:ButtonText|(?:Sub)?Caption|EXEName|Icon|Text)|UninstPage|Var|VI(?:AddVersionKey|ProductVersion)|WindowIcon|Write(?:INIStr|Reg(:?Bin|DWORD|(?:Expand)?Str)|Uninstaller)|XPStyle)\b"><token type="Keyword"/></rule>
<rule pattern="\b(CUR|END|(?:FILE_ATTRIBUTE_)?(?:ARCHIVE|HIDDEN|NORMAL|OFFLINE|READONLY|SYSTEM|TEMPORARY)|HK(CC|CR|CU|DD|LM|PD|U)|HKEY_(?:CLASSES_ROOT|CURRENT_(?:CONFIG|USER)|DYN_DATA|LOCAL_MACHINE|PERFORMANCE_DATA|USERS)|ID(?:ABORT|CANCEL|IGNORE|NO|OK|RETRY|YES)|MB_(?:ABORTRETRYIGNORE|DEFBUTTON[1-4]|ICON(?:EXCLAMATION|INFORMATION|QUESTION|STOP)|OK(?:CANCEL)?|RETRYCANCEL|RIGHT|SETFOREGROUND|TOPMOST|USERICON|YESNO(?:CANCEL)?)|SET|SHCTX|SW_(?:HIDE|SHOW(?:MAXIMIZED|MINIMIZED|NORMAL))|admin|all|auto|both|bottom|bzip2|checkbox|colored|current|false|force|hide|highest|if(?:diff|newer)|lastused|leave|left|listonly|lzma|nevershow|none|normal|off|on|pop|push|radiobuttons|right|show|silent|silentlog|smooth|textonly|top|true|try|user|zlib)\b"><token type="NameConstant"/></rule>
</state>
<state name="macro">
<rule pattern="\!(addincludedir(?:dir)?|addplugindir|appendfile|cd|define|delfilefile|echo(?:message)?|else|endif|error|execute|if(?:macro)?n?(?:def)?|include|insertmacro|macro(?:end)?|packhdr|search(?:parse|replace)|system|tempfilesymbol|undef|verbose|warning)\b"><token type="CommentPreproc"/></rule>
</state>
<state name="interpol">
<rule pattern="\$(R?[0-9])"><token type="NameBuiltinPseudo"/></rule>
<rule pattern="\$(ADMINTOOLS|APPDATA|CDBURN_AREA|COOKIES|COMMONFILES(?:32|64)|DESKTOP|DOCUMENTS|EXE(?:DIR|FILE|PATH)|FAVORITES|FONTS|HISTORY|HWNDPARENT|INTERNET_CACHE|LOCALAPPDATA|MUSIC|NETHOOD|PICTURES|PLUGINSDIR|PRINTHOOD|PROFILE|PROGRAMFILES(?:32|64)|QUICKLAUNCH|RECENT|RESOURCES(?:_LOCALIZED)?|SENDTO|SM(?:PROGRAMS|STARTUP)|STARTMENU|SYSDIR|TEMP(?:LATES)?|VIDEOS|WINDIR|\{NSISDIR\})"><token type="NameBuiltin"/></rule>
<rule pattern="\$(CMDLINE|INSTDIR|OUTDIR|LANGUAGE)"><token type="NameVariableGlobal"/></rule>
<rule pattern="\$[a-z_]\w*"><token type="NameVariable"/></rule>
</state>
<state name="str_double">
<rule pattern="&quot;"><token type="LiteralStringDouble"/><pop depth="1"/></rule>
<rule pattern="\$(\\[nrt&quot;]|\$)"><token type="LiteralStringEscape"/></rule>
<rule><include state="interpol"/></rule>
<rule pattern="[^&quot;]+"><token type="LiteralStringDouble"/></rule>
</state>
<state name="str_backtick">
<rule pattern="`"><token type="LiteralStringDouble"/><pop depth="1"/></rule>
<rule pattern="\$(\\[nrt&quot;]|\$)"><token type="LiteralStringEscape"/></rule>
<rule><include state="interpol"/></rule>
<rule pattern="[^`]+"><token type="LiteralStringDouble"/></rule>
</state>
</rules>
</lexer>

View File

@@ -41,14 +41,6 @@
<rule pattern="\b(as|assert|begin|class|constraint|do|done|downto|else|end|exception|external|false|for|fun|function|functor|if|in|include|inherit|initializer|lazy|let|match|method|module|mutable|new|object|of|open|private|raise|rec|sig|struct|then|to|true|try|type|value|val|virtual|when|while|with)\b">
<token type="Keyword"/>
</rule>
<rule pattern="({([a-z_]*)\|)([\s\S]+?)(?=\|\2})(\|\2})">
<bygroups>
<token type="LiteralStringAffix"/>
<token type="Ignore"/>
<token type="LiteralString"/>
<token type="LiteralStringAffix"/>
</bygroups>
</rule>
<rule pattern="(~|\}|\|]|\||\{&lt;|\{|`|_|]|\[\||\[&gt;|\[&lt;|\[|\?\?|\?|&gt;\}|&gt;]|&gt;|=|&lt;-|&lt;|;;|;|:&gt;|:=|::|:|\.\.|\.|-&gt;|-\.|-|,|\+|\*|\)|\(|&amp;&amp;|&amp;|#|!=)">
<token type="Operator"/>
</rule>

View File

@@ -51,20 +51,6 @@
<rule pattern = "\#[a-zA-Z_]+\b">
<token type = "NameDecorator"/>
</rule>
<rule pattern = "^\#\+\w+\s*$">
<token type = "NameAttribute"/>
</rule>
<rule pattern = "^(\#\+\w+)(\s+)(\!)?([A-Za-z0-9-_!]+)(?:(,)(\!)?([A-Za-z0-9-_!]+))*\s*$">
<bygroups>
<token type = "NameAttribute"/>
<token type = "TextWhitespace"/>
<token type = "Operator"/>
<token type = "Name"/>
<token type = "Punctuation"/>
<token type = "Operator"/>
<token type = "Name"/>
</bygroups>
</rule>
<rule pattern = "\@(\([a-zA-Z_]+\b\s*.*\)|\(?[a-zA-Z_]+\)?)">
<token type = "NameAttribute"/>
</rule>

View File

@@ -1,57 +0,0 @@
<lexer>
<config>
<name>SNBT</name>
<alias>snbt</alias>
<filename>*.snbt</filename>
<mime_type>text/snbt</mime_type>
</config>
<rules>
<state name="root">
<rule pattern="\{"><token type="Punctuation"/><push state="compound"/></rule>
<rule pattern="[^\{]+"><token type="Text"/></rule>
</state>
<state name="whitespace">
<rule pattern="\s+"><token type="TextWhitespace"/></rule>
</state>
<state name="operators">
<rule pattern="[,:;]"><token type="Punctuation"/></rule>
</state>
<state name="literals">
<rule pattern="(true|false)"><token type="KeywordConstant"/></rule>
<rule pattern="-?\d+[eE]-?\d+"><token type="LiteralNumberFloat"/></rule>
<rule pattern="-?\d*\.\d+[fFdD]?"><token type="LiteralNumberFloat"/></rule>
<rule pattern="-?\d+[bBsSlLfFdD]?"><token type="LiteralNumberInteger"/></rule>
<rule pattern="&quot;"><token type="LiteralStringDouble"/><push state="literals.string_double"/></rule>
<rule pattern="&#x27;"><token type="LiteralStringSingle"/><push state="literals.string_single"/></rule>
</state>
<state name="literals.string_double">
<rule pattern="\\."><token type="LiteralStringEscape"/></rule>
<rule pattern="[^\\&quot;\n]+"><token type="LiteralStringDouble"/></rule>
<rule pattern="&quot;"><token type="LiteralStringDouble"/><pop depth="1"/></rule>
</state>
<state name="literals.string_single">
<rule pattern="\\."><token type="LiteralStringEscape"/></rule>
<rule pattern="[^\\&#x27;\n]+"><token type="LiteralStringSingle"/></rule>
<rule pattern="&#x27;"><token type="LiteralStringSingle"/><pop depth="1"/></rule>
</state>
<state name="compound">
<rule pattern="[A-Z_a-z]+"><token type="NameAttribute"/></rule>
<rule><include state="operators"/></rule>
<rule><include state="whitespace"/></rule>
<rule><include state="literals"/></rule>
<rule pattern="\{"><token type="Punctuation"/><push/></rule>
<rule pattern="\["><token type="Punctuation"/><push state="list"/></rule>
<rule pattern="\}"><token type="Punctuation"/><pop depth="1"/></rule>
</state>
<state name="list">
<rule pattern="[A-Z_a-z]+"><token type="NameAttribute"/></rule>
<rule><include state="literals"/></rule>
<rule><include state="operators"/></rule>
<rule><include state="whitespace"/></rule>
<rule pattern="\["><token type="Punctuation"/><push/></rule>
<rule pattern="\{"><token type="Punctuation"/><push state="compound"/></rule>
<rule pattern="\]"><token type="Punctuation"/><pop depth="1"/></rule>
</state>
</rules>
</lexer>

View File

@@ -157,20 +157,8 @@
<rule pattern="(continue|returns|storage|memory|delete|return|throw|break|catch|while|else|from|new|try|for|if|is|as|do|in|_)\b">
<token type="Keyword"/>
</rule>
<rule pattern="(assembly)(\s+\()(.+)(\)\s+{)">
<bygroups>
<token type="Keyword"/>
<token type="Text"/>
<token type="LiteralString"/>
<token type="Text"/>
</bygroups>
<push state="assembly"/>
</rule>
<rule pattern="(assembly)(\s+{)">
<bygroups>
<token type="Keyword"/>
<token type="Text"/>
</bygroups>
<rule pattern="assembly\b">
<token type="Keyword"/>
<push state="assembly"/>
</rule>
<rule pattern="(contract|interface|enum|event|struct)(\s+)([a-zA-Z_]\w*)">
@@ -247,7 +235,7 @@
<token type="Punctuation"/>
<pop depth="1"/>
</rule>
<rule pattern="[(),.]">
<rule pattern="[(),]">
<token type="Punctuation"/>
</rule>
<rule pattern=":=|=:">

View File

@@ -51,22 +51,6 @@
</rule>
</state>
<state name="tag">
<rule>
<include state="jsx"/>
</rule>
<rule pattern=",">
<token type="Punctuation"/>
</rule>
<rule pattern="&#34;(\\\\|\\&#34;|[^&#34;])*&#34;">
<token type="LiteralStringDouble"/>
</rule>
<rule pattern="&#39;(\\\\|\\&#39;|[^&#39;])*&#39;">
<token type="LiteralStringSingle"/>
</rule>
<rule pattern="`">
<token type="LiteralStringBacktick"/>
<push state="interp"/>
</rule>
<rule>
<include state="commentsandwhitespace"/>
</rule>
@@ -187,7 +171,7 @@
</rule>
<rule pattern="(?=/)">
<token type="Text"/>
<push state="badregex"/>
<push state="#pop" state="badregex"/>
</rule>
<rule>
<pop depth="1"/>

View File

@@ -1,107 +0,0 @@
<lexer>
<config>
<name>Typst</name>
<alias>typst</alias>
<filename>*.typ</filename>
<mime_type>text/x-typst</mime_type>
</config>
<rules>
<state name="root">
<rule><include state="markup"/></rule>
</state>
<state name="into_code">
<rule pattern="(\#let|\#set|\#show)\b"><token type="KeywordDeclaration"/><push state="inline_code"/></rule>
<rule pattern="(\#import|\#include)\b"><token type="KeywordNamespace"/><push state="inline_code"/></rule>
<rule pattern="(\#if|\#for|\#while|\#export)\b"><token type="KeywordReserved"/><push state="inline_code"/></rule>
<rule pattern="#\{"><token type="Punctuation"/><push state="code"/></rule>
<rule pattern="#\("><token type="Punctuation"/><push state="code"/></rule>
<rule pattern="(#[a-zA-Z_][a-zA-Z0-9_-]*)(\[)"><bygroups><token type="NameFunction"/><token type="Punctuation"/></bygroups><push state="markup"/></rule>
<rule pattern="(#[a-zA-Z_][a-zA-Z0-9_-]*)(\()"><bygroups><token type="NameFunction"/><token type="Punctuation"/></bygroups><push state="code"/></rule>
<rule pattern="(\#true|\#false|\#none|\#auto)\b"><token type="KeywordConstant"/></rule>
<rule pattern="#[a-zA-Z_][a-zA-Z0-9_]*"><token type="NameVariable"/></rule>
<rule pattern="#0x[0-9a-fA-F]+"><token type="LiteralNumberHex"/></rule>
<rule pattern="#0b[01]+"><token type="LiteralNumberBin"/></rule>
<rule pattern="#0o[0-7]+"><token type="LiteralNumberOct"/></rule>
<rule pattern="#[0-9]+[\.e][0-9]+"><token type="LiteralNumberFloat"/></rule>
<rule pattern="#[0-9]+"><token type="LiteralNumberInteger"/></rule>
</state>
<state name="markup">
<rule><include state="comment"/></rule>
<rule pattern="^\s*=+.*$"><token type="GenericHeading"/></rule>
<rule pattern="[*][^*]*[*]"><token type="GenericStrong"/></rule>
<rule pattern="_[^_]*_"><token type="GenericEmph"/></rule>
<rule pattern="\$"><token type="Punctuation"/><push state="math"/></rule>
<rule pattern="`[^`]*`"><token type="LiteralStringBacktick"/></rule>
<rule pattern="^(\s*)(-)(\s+)"><bygroups><token type="TextWhitespace"/><token type="Punctuation"/><token type="TextWhitespace"/></bygroups></rule>
<rule pattern="^(\s*)(\+)(\s+)"><bygroups><token type="TextWhitespace"/><token type="Punctuation"/><token type="TextWhitespace"/></bygroups></rule>
<rule pattern="^(\s*)([0-9]+\.)"><bygroups><token type="TextWhitespace"/><token type="Punctuation"/></bygroups></rule>
<rule pattern="^(\s*)(/)(\s+)([^:]+)(:)"><bygroups><token type="TextWhitespace"/><token type="Punctuation"/><token type="TextWhitespace"/><token type="NameVariable"/><token type="Punctuation"/></bygroups></rule>
<rule pattern="&lt;[a-zA-Z_][a-zA-Z0-9_-]*&gt;"><token type="NameLabel"/></rule>
<rule pattern="@[a-zA-Z_][a-zA-Z0-9_-]*"><token type="NameLabel"/></rule>
<rule pattern="\\#"><token type="Text"/></rule>
<rule><include state="into_code"/></rule>
<rule pattern="```(?:.|\n)*?```"><token type="LiteralStringBacktick"/></rule>
<rule pattern="https?://[0-9a-zA-Z~/%#&amp;=\&#x27;,;.+?]*"><token type="GenericEmph"/></rule>
<rule pattern="(\-\-\-|\\|\~|\-\-|\.\.\.)\B"><token type="Punctuation"/></rule>
<rule pattern="\\\["><token type="Punctuation"/></rule>
<rule pattern="\\\]"><token type="Punctuation"/></rule>
<rule pattern="\["><token type="Punctuation"/><push/></rule>
<rule pattern="\]"><token type="Punctuation"/><pop depth="1"/></rule>
<rule pattern="[ \t]+\n?|\n"><token type="TextWhitespace"/></rule>
<rule pattern="((?![*_$`&lt;@\\#\] ]|https?://).)+"><token type="Text"/></rule>
</state>
<state name="math">
<rule><include state="comment"/></rule>
<rule pattern="(\\_|\\\^|\\\&amp;)"><token type="Text"/></rule>
<rule pattern="(_|\^|\&amp;|;)"><token type="Punctuation"/></rule>
<rule pattern="(\+|/|=|\[\||\|\]|\|\||\*|:=|::=|\.\.\.|&#x27;|\-|=:|!=|&gt;&gt;|&gt;=|&gt;&gt;&gt;|&lt;&lt;|&lt;=|&lt;&lt;&lt;|\-&gt;|\|\-&gt;|=&gt;|\|=&gt;|==&gt;|\-\-&gt;|\~\~&gt;|\~&gt;|&gt;\-&gt;|\-&gt;&gt;|&lt;\-|&lt;==|&lt;\-\-|&lt;\~\~|&lt;\~|&lt;\-&lt;|&lt;&lt;\-|&lt;\-&gt;|&lt;=&gt;|&lt;==&gt;|&lt;\-\-&gt;|&gt;|&lt;|\~|:|\|)"><token type="Operator"/></rule>
<rule pattern="\\"><token type="Punctuation"/></rule>
<rule pattern="\\\$"><token type="Punctuation"/></rule>
<rule pattern="\$"><token type="Punctuation"/><pop depth="1"/></rule>
<rule><include state="into_code"/></rule>
<rule pattern="([a-zA-Z][a-zA-Z0-9-]*)(\s*)(\()"><bygroups><token type="NameFunction"/><token type="TextWhitespace"/><token type="Punctuation"/></bygroups></rule>
<rule pattern="([a-zA-Z][a-zA-Z0-9-]*)(:)"><bygroups><token type="NameVariable"/><token type="Punctuation"/></bygroups></rule>
<rule pattern="([a-zA-Z][a-zA-Z0-9-]*)"><token type="NameVariable"/></rule>
<rule pattern="[0-9]+(\.[0-9]+)?"><token type="LiteralNumber"/></rule>
<rule pattern="\.{1,3}|\(|\)|,|\{|\}"><token type="Punctuation"/></rule>
<rule pattern="&quot;[^&quot;]*&quot;"><token type="LiteralStringDouble"/></rule>
<rule pattern="[ \t\n]+"><token type="TextWhitespace"/></rule>
</state>
<state name="comment">
<rule pattern="//.*$"><token type="CommentSingle"/></rule>
<rule pattern="/[*](.|\n)*?[*]/"><token type="CommentMultiline"/></rule>
</state>
<state name="code">
<rule><include state="comment"/></rule>
<rule pattern="\["><token type="Punctuation"/><push state="markup"/></rule>
<rule pattern="\(|\{"><token type="Punctuation"/><push state="code"/></rule>
<rule pattern="\)|\}"><token type="Punctuation"/><pop depth="1"/></rule>
<rule pattern="&quot;[^&quot;]*&quot;"><token type="LiteralStringDouble"/></rule>
<rule pattern=",|\.{1,2}"><token type="Punctuation"/></rule>
<rule pattern="="><token type="Operator"/></rule>
<rule pattern="(and|or|not)\b"><token type="OperatorWord"/></rule>
<rule pattern="=&gt;|&lt;=|==|!=|&gt;|&lt;|-=|\+=|\*=|/=|\+|-|\\|\*"><token type="Operator"/></rule>
<rule pattern="([a-zA-Z_][a-zA-Z0-9_-]*)(:)"><bygroups><token type="NameVariable"/><token type="Punctuation"/></bygroups></rule>
<rule pattern="([a-zA-Z_][a-zA-Z0-9_-]*)(\()"><bygroups><token type="NameFunction"/><token type="Punctuation"/></bygroups><push state="code"/></rule>
<rule pattern="(as|break|export|continue|else|for|if|in|return|while)\b"><token type="KeywordReserved"/></rule>
<rule pattern="(import|include)\b"><token type="KeywordNamespace"/></rule>
<rule pattern="(auto|none|true|false)\b"><token type="KeywordConstant"/></rule>
<rule pattern="([0-9.]+)(mm|pt|cm|in|em|fr|%)"><bygroups><token type="LiteralNumber"/><token type="KeywordReserved"/></bygroups></rule>
<rule pattern="0x[0-9a-fA-F]+"><token type="LiteralNumberHex"/></rule>
<rule pattern="0b[01]+"><token type="LiteralNumberBin"/></rule>
<rule pattern="0o[0-7]+"><token type="LiteralNumberOct"/></rule>
<rule pattern="[0-9]+[\.e][0-9]+"><token type="LiteralNumberFloat"/></rule>
<rule pattern="[0-9]+"><token type="LiteralNumberInteger"/></rule>
<rule pattern="(let|set|show)\b"><token type="KeywordDeclaration"/></rule>
<rule pattern="([a-zA-Z_][a-zA-Z0-9_-]*)"><token type="NameVariable"/></rule>
<rule pattern="[ \t\n]+"><token type="TextWhitespace"/></rule>
<rule pattern=":"><token type="Punctuation"/></rule>
</state>
<state name="inline_code">
<rule pattern=";\b"><token type="Punctuation"/><pop depth="1"/></rule>
<rule pattern="\n"><token type="TextWhitespace"/><pop depth="1"/></rule>
<rule><include state="code"/></rule>
</state>
</rules>
</lexer>

View File

@ -1,283 +0,0 @@
<lexer>
<config>
<name>WebVTT</name>
<alias>vtt</alias>
<filename>*.vtt</filename>
<mime_type>text/vtt</mime_type>
</config>
<!--
The WebVTT spec refers to a WebVTT line terminator as either CRLF, CR or LF.
(https://www.w3.org/TR/webvtt1/#webvtt-line-terminator) However, with this
definition it is unclear whether CRLF is one line terminator (CRLF) or two
line terminators (CR and LF).
To work around this ambiguity, only CRLF and LF are considered line terminators.
To my knowledge, only classic Mac OS uses CR as a line terminator, so the lexer should
still work for most files.
-->
<rules>
<!-- https://www.w3.org/TR/webvtt1/#webvtt-file-body -->
<state name="root">
<rule pattern="(\AWEBVTT)((?:[ \t][^\r\n]*)?(?:\r?\n){2,})">
<bygroups>
<token type="Keyword" />
<token type="Text" />
</bygroups>
</rule>
<rule pattern="(^REGION)([ \t]*$)">
<bygroups>
<token type="Keyword" />
<token type="Text" />
</bygroups>
<push state="region-settings-list" />
</rule>
<rule
pattern="(^STYLE)([ \t]*$)((?:(?!&#45;&#45;&gt;)[\s\S])*?)((?:\r?\n){2})">
<bygroups>
<token type="Keyword" />
<token type="Text" />
<using lexer="CSS" />
<token type="Text" />
</bygroups>
</rule>
<rule>
<include state="comment" />
</rule>
<rule
pattern="(?=((?![^\r\n]*&#45;&#45;&gt;)[^\r\n]*\r?\n)?(\d{2}:)?(?:[0-5][0-9]):(?:[0-5][0-9])\.\d{3}[ \t]+&#45;&#45;&gt;[ \t]+(\d{2}:)?(?:[0-5][0-9]):(?:[0-5][0-9])\.\d{3})"
>
<push state="cues" />
</rule>
</state>
<!-- https://www.w3.org/TR/webvtt1/#webvtt-region-settings-list -->
<state name="region-settings-list">
<rule pattern="(?: |\t|\r?\n(?!\r?\n))+">
<token type="Text" />
</rule>
<rule pattern="(?:\r?\n){2}">
<token type="Text" />
<pop depth="1" />
</rule>
<rule pattern="(id)(:)(?!&#45;&#45;&gt;)(\S+)">
<bygroups>
<token type="Keyword" />
<token type="Punctuation" />
<token type="Literal" />
</bygroups>
</rule>
<rule pattern="(width)(:)((?:[1-9]?\d|100)(?:\.\d+)?)(%)">
<bygroups>
<token type="Keyword" />
<token type="Punctuation" />
<token type="Literal" />
<token type="KeywordType" />
</bygroups>
</rule>
<rule pattern="(lines)(:)(\d+)">
<bygroups>
<token type="Keyword" />
<token type="Punctuation" />
<token type="Literal" />
</bygroups>
</rule>
<rule
pattern="(regionanchor|viewportanchor)(:)((?:[1-9]?\d|100)(?:\.\d+)?)(%)(,)((?:[1-9]?\d|100)(?:\.\d+)?)(%)">
<bygroups>
<token type="Keyword" />
<token type="Punctuation" />
<token type="Literal" />
<token type="KeywordType" />
<token type="Punctuation" />
<token type="Literal" />
<token type="KeywordType" />
</bygroups>
</rule>
<rule pattern="(scroll)(:)(up)">
<bygroups>
<token type="Keyword" />
<token type="Punctuation" />
<token type="KeywordConstant" />
</bygroups>
</rule>
</state>
<!-- https://www.w3.org/TR/webvtt1/#webvtt-comment-block -->
<state name="comment">
<rule
pattern="^NOTE( |\t|\r?\n)((?!&#45;&#45;&gt;)[\s\S])*?(?:(\r?\n){2}|\Z)">
<token type="Comment" />
</rule>
</state>
<!--
"Zero or more WebVTT cue blocks and WebVTT comment blocks separated from each other by one or more
WebVTT line terminators." (https://www.w3.org/TR/webvtt1/#file-structure)
-->
<state name="cues">
<rule
pattern="(?:((?!&#45;&#45;&gt;)[^\r\n]+)?(\r?\n))?((?:\d{2}:)?(?:[0-5][0-9]):(?:[0-5][0-9])\.\d{3})([ \t]+)(&#45;&#45;&gt;)([ \t]+)((?:\d{2}:)?(?:[0-5][0-9]):(?:[0-5][0-9])\.\d{3})([ \t]*)">
<bygroups>
<token type="Name" />
<token type="Text" />
<token type="LiteralDate" />
<token type="Text" />
<token type="Operator" />
<token type="Text" />
<token type="LiteralDate" />
<token type="Text" />
</bygroups>
<push state="cue-settings-list" />
</rule>
<rule>
<include state="comment" />
</rule>
</state>
<!-- https://www.w3.org/TR/webvtt1/#webvtt-cue-settings-list -->
<state name="cue-settings-list">
<rule pattern="[ \t]+">
<token type="Text" />
</rule>
<rule pattern="(vertical)(:)?(rl|lr)?">
<bygroups>
<token type="Keyword" />
<token type="Punctuation" />
<token type="KeywordConstant" />
</bygroups>
</rule>
<rule
pattern="(line)(:)?(?:(?:((?:[1-9]?\d|100)(?:\.\d+)?)(%)|(-?\d+))(?:(,)(start|center|end))?)?">
<bygroups>
<token type="Keyword" />
<token type="Punctuation" />
<token type="Literal" />
<token type="KeywordType" />
<token type="Literal" />
<token type="Punctuation" />
<token type="KeywordConstant" />
</bygroups>
</rule>
<rule
pattern="(position)(:)?(?:(?:((?:[1-9]?\d|100)(?:\.\d+)?)(%)|(-?\d+))(?:(,)(line-left|center|line-right))?)?">
<bygroups>
<token type="Keyword" />
<token type="Punctuation" />
<token type="Literal" />
<token type="KeywordType" />
<token type="Literal" />
<token type="Punctuation" />
<token type="KeywordConstant" />
</bygroups>
</rule>
<rule pattern="(size)(:)?(?:((?:[1-9]?\d|100)(?:\.\d+)?)(%))?">
<bygroups>
<token type="Keyword" />
<token type="Punctuation" />
<token type="Literal" />
<token type="KeywordType" />
</bygroups>
</rule>
<rule pattern="(align)(:)?(start|center|end|left|right)?">
<bygroups>
<token type="Keyword" />
<token type="Punctuation" />
<token type="KeywordConstant" />
</bygroups>
</rule>
<rule pattern="(region)(:)?((?![^\r\n]*&#45;&#45;&gt;(?=[ \t]+?))[^ \t\r\n]+)?">
<bygroups>
<token type="Keyword" />
<token type="Punctuation" />
<token type="Literal" />
</bygroups>
</rule>
<rule
pattern="(?=\r?\n)">
<push state="cue-payload" />
</rule>
</state>
<!-- https://www.w3.org/TR/webvtt1/#cue-payload -->
<state name="cue-payload">
<rule pattern="(\r?\n){2,}">
<token type="Text" />
<pop depth="2" />
</rule>
<rule pattern="[^&lt;&amp;]+?">
<token type="Text" />
</rule>
<rule pattern="&amp;(#\d+|#x[0-9A-Fa-f]+|[a-zA-Z0-9]+);">
<token type="Text" />
</rule>
<rule pattern="(?=&lt;)">
<token type="Text" />
<push state="cue-span-tag" />
</rule>
</state>
<state name="cue-span-tag">
<rule
pattern="&lt;(?=c|i|b|u|ruby|rt|v|lang|(?:\d{2}:)?(?:[0-5][0-9]):(?:[0-5][0-9])\.\d{3})">
<token type="Punctuation" />
<push state="cue-span-start-tag-name" />
</rule>
<rule pattern="(&lt;/)(c|i|b|u|ruby|rt|v|lang)">
<bygroups>
<token type="Punctuation" />
<token type="NameTag" />
</bygroups>
</rule>
<rule pattern="&gt;">
<token type="Punctuation" />
<pop depth="1" />
</rule>
</state>
<state name="cue-span-start-tag-name">
<rule pattern="(c|i|b|u|ruby|rt)|((?:\d{2}:)?(?:[0-5][0-9]):(?:[0-5][0-9])\.\d{3})">
<bygroups>
<token type="NameTag" />
<token type="LiteralDate" />
</bygroups>
<push state="cue-span-classes-without-annotations" />
</rule>
<rule pattern="v|lang">
<token type="NameTag" />
<push state="cue-span-classes-with-annotations" />
</rule>
</state>
<state name="cue-span-classes-without-annotations">
<rule>
<include state="cue-span-classes" />
</rule>
<rule pattern="(?=&gt;)">
<pop depth="2" />
</rule>
</state>
<state name="cue-span-classes-with-annotations">
<rule>
<include state="cue-span-classes" />
</rule>
<rule pattern="(?=[ \t])">
<push state="cue-span-start-tag-annotations" />
</rule>
</state>
<state name="cue-span-classes">
<rule pattern="(\.)([^ \t\n\r&amp;&lt;&gt;\.]+)">
<bygroups>
<token type="Punctuation" />
<token type="NameTag" />
</bygroups>
</rule>
</state>
<state name="cue-span-start-tag-annotations">
<rule
pattern="[ \t](?:[^\n\r&amp;&gt;]|&amp;(?:#\d+|#x[0-9A-Fa-f]+|[a-zA-Z0-9]+);)+">
<token type="Text" />
</rule>
<rule pattern="(?=&gt;)">
<token type="Text" />
<pop depth="3" />
</rule>
</state>
</rules>
</lexer>
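
The comment at the top of the WebVTT lexer removed in this diff explains why its patterns treat only CRLF and LF as line terminators. A minimal sketch of that behaviour, assuming Crystal's PCRE2 regex engine and a simplified form of the header pattern above:

```crystal
# Both CRLF- and LF-terminated headers match the same "\r?\n"-based pattern,
# while a lone CR would not; this mirrors the comment in the lexer above.
["WEBVTT\r\n\r\n", "WEBVTT\n\n"].each do |header|
  puts header.matches?(/\AWEBVTT(?:[ \t][^\r\n]*)?(?:\r?\n){2,}/) # => true, true
end
```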

View File

@ -53,7 +53,7 @@
<bygroups>
<token type="Punctuation"/>
<token type="LiteralStringDoc"/>
<token type="Ignore"/>
<token type="TextWhitespace"/>
</bygroups>
</rule>
<rule pattern="(false|False|FALSE|true|True|TRUE|null|Off|off|yes|Yes|YES|OFF|On|ON|no|No|on|NO|n|N|Y|y)\b">

View File

@ -38,12 +38,6 @@ for fname in glob.glob("lexers/*.xml"):
lexer_by_filename[filename].add(lexer_name)
with open("src/constants/lexers.cr", "w") as f:
# Crystal doesn't come from an xml file
lexer_by_name["crystal"] = "crystal"
lexer_by_name["cr"] = "crystal"
lexer_by_filename["*.cr"] = ["crystal"]
lexer_by_mimetype["text/x-crystal"] = "crystal"
f.write("module Tartrazine\n")
f.write(" LEXERS_BY_NAME = {\n")
for k in sorted(lexer_by_name.keys()):

View File

@ -1,5 +1,5 @@
name: tartrazine
version: 0.12.0
version: 0.7.0
authors:
- Roberto Alsina <roberto.alsina@gmail.com>
@ -18,10 +18,6 @@ dependencies:
github: ralsina/sixteen
docopt:
github: chenkovsky/docopt.cr
stumpy_utils:
github: stumpycr/stumpy_utils
stumpy_png:
github: stumpycr/stumpy_png
crystal: ">= 1.13.0"

View File

@ -1 +1 @@
.e {color: #aa0000;background-color: #ffaaaa;}.b {background-color: #f0f3f3;tab-size: 8;}.k {color: #006699;font-weight: 600;}.kp {}.kt {color: #007788;}.na {color: #330099;}.nb {color: #336666;}.nc {color: #00aa88;font-weight: 600;}.nc {color: #336600;}.nd {color: #9999ff;}.ne {color: #999999;font-weight: 600;}.ne {color: #cc0000;font-weight: 600;}.nf {color: #cc00ff;}.nl {color: #9999ff;}.nn {color: #00ccff;font-weight: 600;}.nt {color: #330099;font-weight: 600;}.nv {color: #003333;}.ls {color: #cc3300;}.lsd {font-style: italic;}.lse {color: #cc3300;font-weight: 600;}.lsi {color: #aa0000;}.lso {color: #cc3300;}.lsr {color: #33aaaa;}.lss {color: #ffcc33;}.ln {color: #ff6600;}.o {color: #555555;}.ow {color: #000000;font-weight: 600;}.c {color: #0099ff;font-style: italic;}.cs {font-weight: 600;}.cp {color: #009999;font-style: normal;}.gd {background-color: #ffcccc;border: 1px solid #cc0000;}.ge {font-style: italic;}.ge {color: #ff0000;}.gh {color: #003300;font-weight: 600;}.gi {background-color: #ccffcc;border: 1px solid #00cc00;}.go {color: #aaaaaa;}.gp {color: #000099;font-weight: 600;}.gs {font-weight: 600;}.gs {color: #003300;font-weight: 600;}.gt {color: #99cc66;}.gu {text-decoration: underline;}.tw {color: #bbbbbb;}.lh {}
.e {color: #aa0000;background-color: #ffaaaa;}.b {background-color: #f0f3f3;tab-size: 8;}.k {color: #006699;font-weight: bold;}.kp {font-weight: 600;}.kt {color: #007788;}.na {color: #330099;}.nb {color: #336666;}.nc {color: #00aa88;font-weight: bold;}.nc {color: #336600;}.nd {color: #9999ff;}.ne {color: #999999;font-weight: bold;}.ne {color: #cc0000;font-weight: bold;}.nf {color: #cc00ff;}.nl {color: #9999ff;}.nn {color: #00ccff;font-weight: bold;}.nt {color: #330099;font-weight: bold;}.nv {color: #003333;}.ls {color: #cc3300;}.lsd {font-style: italic;}.lse {color: #cc3300;font-weight: bold;}.lsi {color: #aa0000;}.lso {color: #cc3300;}.lsr {color: #33aaaa;}.lss {color: #ffcc33;}.ln {color: #ff6600;}.o {color: #555555;}.ow {color: #000000;font-weight: bold;}.c {color: #0099ff;font-style: italic;}.cs {font-weight: bold;}.cp {color: #009999;font-style: normal;}.gd {background-color: #ffcccc;border: 1px solid #cc0000;}.ge {font-style: italic;}.ge {color: #ff0000;}.gh {color: #003300;font-weight: bold;}.gi {background-color: #ccffcc;border: 1px solid #00cc00;}.go {color: #aaaaaa;}.gp {color: #000099;font-weight: bold;}.gs {font-weight: bold;}.gs {color: #003300;font-weight: bold;}.gt {color: #99cc66;}.gu {text-decoration: underline;}.tw {color: #bbbbbb;}.lh {}

View File

@ -1 +1 @@
.b {color: #b7b7b7;background-color: #101010;font-weight: 600;tab-size: 8;}.lh {color: #8eaaaa;background-color: #232323;}.t {color: #b7b7b7;}.e {color: #de6e6e;}.c {color: #333333;}.cp {color: #876c4f;}.cpf {color: #5f8787;}.k {color: #d69094;}.kt {color: #de6e6e;}.na {color: #8eaaaa;}.nb {color: #de6e6e;}.nbp {color: #de6e6e;}.nc {color: #8eaaaa;}.nc {color: #dab083;}.nd {color: #dab083;}.nf {color: #8eaaaa;}.nn {color: #8eaaaa;}.nt {color: #d69094;}.nv {color: #8eaaaa;}.nvi {color: #de6e6e;}.ln {color: #dab083;}.o {color: #60a592;}.ow {color: #d69094;}.l {color: #5f8787;}.ls {color: #5f8787;}.lsi {color: #876c4f;}.lsr {color: #60a592;}.lss {color: #dab083;}
.b {color: #b7b7b7;background-color: #101010;font-weight: bold;tab-size: 8;}.lh {color: #8eaaaa;background-color: #232323;}.t {color: #b7b7b7;}.e {color: #de6e6e;}.c {color: #333333;}.cp {color: #876c4f;}.cpf {color: #5f8787;}.k {color: #d69094;}.kt {color: #de6e6e;}.na {color: #8eaaaa;}.nb {color: #de6e6e;}.nbp {color: #de6e6e;}.nc {color: #8eaaaa;}.nc {color: #dab083;}.nd {color: #dab083;}.nf {color: #8eaaaa;}.nn {color: #8eaaaa;}.nt {color: #d69094;}.nv {color: #8eaaaa;}.nvi {color: #de6e6e;}.ln {color: #dab083;}.o {color: #60a592;}.ow {color: #d69094;}.l {color: #5f8787;}.ls {color: #5f8787;}.lsi {color: #876c4f;}.lsr {color: #60a592;}.lss {color: #dab083;}

View File

@ -1,5 +1,4 @@
require "./spec_helper"
require "digest/sha1"
# These are the testcases from Pygments
testcases = Dir.glob("#{__DIR__}/tests/**/*txt").sort
@ -47,13 +46,8 @@ known_bad = {
"#{__DIR__}/tests/bash_session/test_newline_in_ls_no_ps2.txt",
"#{__DIR__}/tests/bash_session/test_newline_in_ls_ps2.txt",
"#{__DIR__}/tests/bash_session/test_virtualenv.txt",
"#{__DIR__}/tests/mcfunction/commenting.txt",
"#{__DIR__}/tests/mcfunction/coordinates.txt",
"#{__DIR__}/tests/mcfunction/data.txt",
"#{__DIR__}/tests/mcfunction/difficult_1.txt",
"#{__DIR__}/tests/mcfunction/multiline.txt",
"#{__DIR__}/tests/mcfunction/selectors.txt",
"#{__DIR__}/tests/mcfunction/simple.txt",
}
# Tests that fail because of a limitation in PCRE2
@ -109,7 +103,6 @@ describe Tartrazine do
)
end
end
describe "to_ansi" do
it "should do basic highlighting" do
ansi = Tartrazine.to_ansi("puts 'Hello, World!'", "ruby")
@ -121,29 +114,11 @@ describe Tartrazine do
)
else
ansi.should eq(
"\e[38;2;171;70;66mputs\e[0m\e[38;2;216;216;216m \e[0m\e[38;2;161;181;108m'Hello, World!'\e[0m"
"\e[38;2;171;70;66mputs\e[0m\e[38;2;216;216;216m \e[0m'Hello, World!'"
)
end
end
end
describe "to_svg" do
it "should do basic highlighting" do
svg = Tartrazine.to_svg("puts 'Hello, World!'", "ruby", standalone: false)
svg.should eq(
"<text x=\"0\" y=\"19\" xml:space=\"preserve\"><tspan fill=\"#ab4642\">puts</tspan><tspan fill=\"#d8d8d8\"> </tspan><tspan fill=\"#a1b56c\">&#39;Hello, World!&#39;</tspan></text>"
)
end
end
describe "to_png" do
it "should do basic highlighting" do
png = Digest::SHA1.hexdigest(Tartrazine.to_png("puts 'Hello, World!'", "ruby"))
png.should eq(
"62d419dcd263fffffc265a0f04c156dc2530c362"
)
end
end
end
# Helper that creates lexer and tokenizes

View File

@ -471,7 +471,7 @@ module Tartrazine
"application/x-fennel" => "fennel",
"application/x-fish" => "fish",
"application/x-forth" => "forth",
"application/x-gdscript" => "gdscript3",
"application/x-gdscript" => "gdscript",
"application/x-hcl" => "hcl",
"application/x-hy" => "hy",
"application/x-javascript" => "javascript",
@ -594,7 +594,7 @@ module Tartrazine
"text/x-fortran" => "fortran",
"text/x-fsharp" => "fsharp",
"text/x-gas" => "gas",
"text/x-gdscript" => "gdscript3",
"text/x-gdscript" => "gdscript",
"text/x-gherkin" => "gherkin",
"text/x-gleam" => "gleam",
"text/x-glslsrc" => "glsl",

View File

@ -34,6 +34,8 @@ module Tartrazine
end
def colorize(text : String, token : String) : String
style = theme.styles.fetch(token, nil)
return text if style.nil?
if theme.styles.has_key?(token)
s = theme.styles[token]
else

View File

@ -1,117 +0,0 @@
require "../formatter"
require "compress/gzip"
require "digest/sha1"
require "stumpy_png"
require "stumpy_utils"
module Tartrazine
def self.to_png(text : String, language : String,
theme : String = "default-dark",
line_numbers : Bool = false) : String
buf = IO::Memory.new
Tartrazine::Png.new(
theme: Tartrazine.theme(theme),
line_numbers: line_numbers
).format(text, Tartrazine.lexer(name: language), buf)
buf.to_s
end
class FontFiles
extend BakedFileSystem
bake_folder "../../fonts", __DIR__
end
class Png < Formatter
include StumpyPNG
property? line_numbers : Bool = false
@font_regular : PCFParser::Font
@font_bold : PCFParser::Font
@font_oblique : PCFParser::Font
@font_bold_oblique : PCFParser::Font
@font_width = 15
@font_height = 24
def initialize(@theme : Theme = Tartrazine.theme("default-dark"), @line_numbers : Bool = false)
@font_regular = load_font("/courier-regular.pcf.gz")
@font_bold = load_font("/courier-bold.pcf.gz")
@font_oblique = load_font("/courier-oblique.pcf.gz")
@font_bold_oblique = load_font("/courier-bold-oblique.pcf.gz")
end
private def load_font(name : String) : PCFParser::Font
compressed = FontFiles.get(name)
uncompressed = Compress::Gzip::Reader.open(compressed) do |gzip|
gzip.gets_to_end
end
PCFParser::Font.new(IO::Memory.new uncompressed)
end
private def line_label(i : Int32) : String
"#{i + 1}".rjust(4).ljust(5)
end
def format(text : String, lexer : BaseLexer, outp : IO) : Nil
# Create canvas of correct size
lines = text.split("\n")
canvas_height = lines.size * @font_height
canvas_width = lines.max_of(&.size)
canvas_width += 5 if line_numbers?
canvas_width *= @font_width
bg_color = RGBA.from_hex("##{theme.styles["Background"].background.try &.hex}")
canvas = Canvas.new(canvas_width, canvas_height, bg_color)
tokenizer = lexer.tokenizer(text)
x = 0
y = @font_height
i = 0
if line_numbers?
canvas.text(x, y, line_label(i), @font_regular, RGBA.from_hex("##{theme.styles["Background"].color.try &.hex}"))
x += 5 * @font_width
end
tokenizer.each do |token|
font, color = token_style(token[:type])
# These fonts are very limited
t = token[:value].gsub(/[^[:ascii:]]/, "?")
canvas.text(x, y, t.rstrip("\n"), font, color)
if token[:value].includes?("\n")
x = 0
y += @font_height
i += 1
if line_numbers?
canvas.text(x, y, line_label(i), @font_regular, RGBA.from_hex("##{theme.styles["Background"].color.try &.hex}"))
x += 5 * @font_width
end
end
x += token[:value].size * @font_width
end
StumpyPNG.write(canvas, outp)
end
def token_style(token : String) : {PCFParser::Font, RGBA}
if theme.styles.has_key?(token)
s = theme.styles[token]
else
# Themes don't contain information for each specific
# token type. However, they may contain information
# for a parent style. Worst case, we go to the root
# (Background) style.
s = theme.styles[theme.style_parents(token).reverse.find { |parent|
theme.styles.has_key?(parent)
}]
end
color = RGBA.from_hex("##{theme.styles["Background"].color.try &.hex}")
color = RGBA.from_hex("##{s.color.try &.hex}") if s.color
return {@font_bold_oblique, color} if s.bold && s.italic
return {@font_bold, color} if s.bold
return {@font_oblique, color} if s.italic
return {@font_regular, color}
end
end
end
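
The deleted png.cr above also defines the high-level entry point `Tartrazine.to_png` near its top. A minimal usage sketch, assuming that signature and reusing the snippet from the spec file earlier in this diff (the output file name is only illustrative):

```crystal
require "tartrazine"

# Render a highlighted snippet to PNG bytes and write them to disk;
# theme and line_numbers fall back to the defaults in the signature above.
png_bytes = Tartrazine.to_png("puts 'Hello, World!'", "ruby")
File.write("hello.png", png_bytes)
```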

View File

@ -6,21 +6,11 @@ require "crystal/syntax_highlighter"
module Tartrazine
class LexerFiles
extend BakedFileSystem
macro bake_selected_lexers
{% for lexer in env("TT_LEXERS").split "," %}
bake_file {{ lexer }}+".xml", {{ read_file "#{__DIR__}/../lexers/" + lexer + ".xml" }}
{% end %}
end
{% if flag?(:nolexers) %}
bake_selected_lexers
{% else %}
bake_folder "../lexers", __DIR__
{% end %}
bake_folder "../lexers", __DIR__
end
# Get the lexer object for a language name
# FIXME: support mimetypes
def self.lexer(name : String? = nil, filename : String? = nil, mimetype : String? = nil) : BaseLexer
return lexer_by_name(name) if name && name != "autodetect"
return lexer_by_filename(filename) if filename
@ -43,8 +33,6 @@ module Tartrazine
raise Exception.new("Unknown lexer: #{name}") if lexer_file_name.nil?
RegexLexer.from_xml(LexerFiles.get("/#{lexer_file_name}.xml").gets_to_end)
rescue ex : BakedFileSystem::NoSuchFileError
raise Exception.new("Unknown lexer: #{name}")
end
private def self.lexer_by_filename(filename : String) : BaseLexer
@ -96,8 +84,7 @@ module Tartrazine
# Return a list of all lexers
def self.lexers : Array(String)
file_map = LexerFiles.files.map(&.path)
LEXERS_BY_NAME.keys.select { |k| file_map.includes?("/#{k}.xml") }.sort!
LEXERS_BY_NAME.keys.sort!
end
# A token, the output of the tokenizer
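
This hunk touches the lexer-selection entry points (`Tartrazine.lexer` and `Tartrazine.lexers`). A minimal usage sketch, assuming the signatures shown above; the file name is made up, and the MIME type is taken from the constants table earlier in this diff:

```crystal
require "tartrazine"

# Pick a lexer by file name or by MIME type instead of by language name.
by_file = Tartrazine.lexer(filename: "example.py")
by_mime = Tartrazine.lexer(mimetype: "text/x-gdscript")
puts Tartrazine.lexers # names of all lexers baked into this build
```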

View File

@ -4,10 +4,6 @@ require "./tartrazine"
HELP = <<-HELP
tartrazine: a syntax highlighting tool
You can use the CLI to generate HTML, terminal, JSON or SVG output
from a source file using different themes.
Keep in mind that not all formatters support all features.
Usage:
tartrazine (-h, --help)
tartrazine FILE -f html [-t theme][--standalone][--line-numbers]
@ -17,8 +13,6 @@ Usage:
[-o output]
tartrazine FILE -f svg [-t theme][--standalone][--line-numbers]
[-l lexer][-o output]
tartrazine FILE -f png [-t theme][--line-numbers]
[-l lexer][-o output]
tartrazine FILE -f json [-o output]
tartrazine --list-themes
tartrazine --list-lexers
@ -84,10 +78,6 @@ if options["-f"]
formatter.standalone = options["--standalone"] != nil
formatter.line_numbers = options["--line-numbers"] != nil
formatter.theme = theme
when "png"
formatter = Tartrazine::Png.new
formatter.line_numbers = options["--line-numbers"] != nil
formatter.theme = theme
else
puts "Invalid formatter: #{formatter}"
exit 1

View File

@ -11,20 +11,7 @@ module Tartrazine
struct ThemeFiles
extend BakedFileSystem
macro bake_selected_themes
{% if env("TT_THEMES") %}
{% for theme in env("TT_THEMES").split "," %}
bake_file {{ theme }}+".xml", {{ read_file "#{__DIR__}/../styles/" + theme + ".xml" }}
{% end %}
{% end %}
end
{% if flag?(:nothemes) %}
bake_selected_themes
{% else %}
bake_folder "../styles", __DIR__
{% end %}
bake_folder "../styles", __DIR__
end
def self.theme(name : String) : Theme

View File

@ -1,39 +1,44 @@
<style name="github">
<entry type="Error" style="#f6f8fa bg:#82071e"/>
<entry type="Error" style="#a61717 bg:#e3d2d2"/>
<entry type="Background" style="bg:#ffffff"/>
<entry type="Keyword" style="#cf222e"/>
<entry type="KeywordType" style="#cf222e"/>
<entry type="NameAttribute" style="#1f2328"/>
<entry type="NameBuiltin" style="#6639ba"/>
<entry type="NameBuiltinPseudo" style="#6a737d"/>
<entry type="NameClass" style="#1f2328"/>
<entry type="NameConstant" style="#0550ae"/>
<entry type="NameDecorator" style="#0550ae"/>
<entry type="NameEntity" style="#6639ba"/>
<entry type="NameFunction" style="#6639ba"/>
<entry type="Keyword" style="bold #000000"/>
<entry type="KeywordType" style="bold #445588"/>
<entry type="NameAttribute" style="#008080"/>
<entry type="NameBuiltin" style="#0086b3"/>
<entry type="NameBuiltinPseudo" style="#999999"/>
<entry type="NameClass" style="bold #445588"/>
<entry type="NameConstant" style="#008080"/>
<entry type="NameDecorator" style="bold #3c5d5d"/>
<entry type="NameEntity" style="#800080"/>
<entry type="NameException" style="bold #990000"/>
<entry type="NameFunction" style="bold #990000"/>
<entry type="NameLabel" style="bold #990000"/>
<entry type="NameNamespace" style="#24292e"/>
<entry type="NameOther" style="#1f2328"/>
<entry type="NameTag" style="#0550ae"/>
<entry type="NameVariable" style="#953800"/>
<entry type="NameVariableClass" style="#953800"/>
<entry type="NameVariableGlobal" style="#953800"/>
<entry type="NameVariableInstance" style="#953800"/>
<entry type="LiteralString" style="#0a3069"/>
<entry type="LiteralStringRegex" style="#0a3069"/>
<entry type="LiteralStringSymbol" style="#032f62"/>
<entry type="LiteralNumber" style="#0550ae"/>
<entry type="Operator" style="#0550ae"/>
<entry type="Comment" style="#57606a"/>
<entry type="CommentMultiline" style="#57606a"/>
<entry type="CommentSingle" style="#57606a"/>
<entry type="CommentSpecial" style="#57606a"/>
<entry type="CommentPreproc" style="#57606a"/>
<entry type="GenericDeleted" style="#82071e bg:#ffebe9"/>
<entry type="GenericEmph" style="#1f2328"/>
<entry type="GenericInserted" style="#116329 bg:#dafbe1"/>
<entry type="GenericOutput" style="#1f2328"/>
<entry type="NameNamespace" style="#555555"/>
<entry type="NameTag" style="#000080"/>
<entry type="NameVariable" style="#008080"/>
<entry type="NameVariableClass" style="#008080"/>
<entry type="NameVariableGlobal" style="#008080"/>
<entry type="NameVariableInstance" style="#008080"/>
<entry type="LiteralString" style="#dd1144"/>
<entry type="LiteralStringRegex" style="#009926"/>
<entry type="LiteralStringSymbol" style="#990073"/>
<entry type="LiteralNumber" style="#009999"/>
<entry type="Operator" style="bold #000000"/>
<entry type="Comment" style="italic #999988"/>
<entry type="CommentMultiline" style="italic #999988"/>
<entry type="CommentSingle" style="italic #999988"/>
<entry type="CommentSpecial" style="bold italic #999999"/>
<entry type="CommentPreproc" style="bold #999999"/>
<entry type="GenericDeleted" style="#000000 bg:#ffdddd"/>
<entry type="GenericEmph" style="italic #000000"/>
<entry type="GenericError" style="#aa0000"/>
<entry type="GenericHeading" style="#999999"/>
<entry type="GenericInserted" style="#000000 bg:#ddffdd"/>
<entry type="GenericOutput" style="#888888"/>
<entry type="GenericPrompt" style="#555555"/>
<entry type="GenericStrong" style="bold"/>
<entry type="GenericSubheading" style="#aaaaaa"/>
<entry type="GenericTraceback" style="#aa0000"/>
<entry type="GenericUnderline" style="underline"/>
<entry type="Punctuation" style="#1f2328"/>
<entry type="TextWhitespace" style="#ffffff"/>
<entry type="TextWhitespace" style="#bbbbbb"/>
</style>

View File

@ -1,10 +0,0 @@
<style name="onesenterprise">
<entry type="Keyword" style="#ff0000"/>
<entry type="Name" style="#0000ff"/>
<entry type="LiteralString" style="#000000"/>
<entry type="Operator" style="#ff0000"/>
<entry type="Punctuation" style="#ff0000"/>
<entry type="Comment" style="#008000"/>
<entry type="CommentPreproc" style="#963200"/>
<entry type="Text" style="#000000"/>
</style>

View File

@ -1,42 +0,0 @@
<style name="pygments">
<entry type="Error" style="border:#ff0000"/>
<entry type="Keyword" style="bold #008000"/>
<entry type="KeywordPseudo" style="nobold"/>
<entry type="KeywordType" style="nobold #b00040"/>
<entry type="NameAttribute" style="#7d9029"/>
<entry type="NameBuiltin" style="#008000"/>
<entry type="NameClass" style="bold #0000ff"/>
<entry type="NameConstant" style="#880000"/>
<entry type="NameDecorator" style="#aa22ff"/>
<entry type="NameEntity" style="bold #999999"/>
<entry type="NameException" style="bold #d2413a"/>
<entry type="NameFunction" style="#0000ff"/>
<entry type="NameLabel" style="#a0a000"/>
<entry type="NameNamespace" style="bold #0000ff"/>
<entry type="NameTag" style="bold #008000"/>
<entry type="NameVariable" style="#19177c"/>
<entry type="LiteralString" style="#ba2121"/>
<entry type="LiteralStringDoc" style="italic"/>
<entry type="LiteralStringEscape" style="bold #bb6622"/>
<entry type="LiteralStringInterpol" style="bold #bb6688"/>
<entry type="LiteralStringOther" style="#008000"/>
<entry type="LiteralStringRegex" style="#bb6688"/>
<entry type="LiteralStringSymbol" style="#19177c"/>
<entry type="LiteralNumber" style="#666666"/>
<entry type="Operator" style="#666666"/>
<entry type="OperatorWord" style="bold #aa22ff"/>
<entry type="Comment" style="italic #408080"/>
<entry type="CommentPreproc" style="noitalic #bc7a00"/>
<entry type="GenericDeleted" style="#a00000"/>
<entry type="GenericEmph" style="italic"/>
<entry type="GenericError" style="#ff0000"/>
<entry type="GenericHeading" style="bold #000080"/>
<entry type="GenericInserted" style="#00a000"/>
<entry type="GenericOutput" style="#888888"/>
<entry type="GenericPrompt" style="bold #000080"/>
<entry type="GenericStrong" style="bold"/>
<entry type="GenericSubheading" style="bold #800080"/>
<entry type="GenericTraceback" style="#0044dd"/>
<entry type="GenericUnderline" style="underline"/>
<entry type="TextWhitespace" style="#bbbbbb"/>
</style>