27 Commits

Author SHA1 Message Date
d1ce83a9c8 fix: Don't log when falling back to ruby, it breaks stuff 2025-03-09 18:38:42 -03:00
aabe028767 feat: Support custom template for HTML standalone output 2025-02-21 19:43:09 -03:00
ed61a84553 fix: when the internal crystal highlighter fails, fallback to ruby. Fixes #13 2025-02-20 13:24:53 -03:00
b7e4aaa1f9 chore: typo 2025-02-19 09:38:12 -03:00
0f6b9b0117 fix: better error message when loading a XML theme 2025-02-18 21:45:26 -03:00
b81e9c4405 chore: upgrade ci image 2025-01-21 15:11:06 -03:00
9c85c6cf18 bump: Release v0.12.0 2025-01-21 14:37:49 -03:00
d4e189e596 chore: mark more mcfunction tests as bad 2025-01-21 14:37:10 -03:00
71fb699f96 chore: Pin ubuntu version in CI 2025-01-21 14:37:03 -03:00
5fe309f24c feat: Bumped to latest chroma release 2025-01-21 12:31:39 -03:00
62db71ae4d build: automate AUR release 2024-10-14 17:27:31 -03:00
fff6cad5ac bump: Release v0.11.1 2024-10-14 16:56:17 -03:00
44e6af8546 fix: support choosing lexers when used as a library 2024-10-14 16:45:50 -03:00
9e2585a875 bump: Release v0.11.0 2024-10-14 13:28:47 -03:00
c16b139fa3 feat: support selecting only some themes 2024-10-14 13:11:22 -03:00
e11775040c chore(build): strip static binary 2024-09-26 20:50:14 -03:00
30bc8cccba chore: build before tag 2024-09-26 20:47:58 -03:00
1638c253cb bump: Release v0.10.0 2024-09-26 20:35:51 -03:00
c374f52aee Merge pull request #9 from ralsina/conditional-lexers-and-themes
Conditional lexers and themes
2024-09-26 20:34:24 -03:00
96fd9bdfe9 fix: Fix metadata to show crystal 2024-09-26 18:47:47 -03:00
0423811c5d feat: optional conditional baking of lexers 2024-09-26 18:47:47 -03:00
3d9d3ab5cf fix: strip binaries for release artifacts 2024-09-21 21:28:13 -03:00
92a97490f1 bump: Release v0.9.1 2024-09-21 21:08:41 -03:00
22decedf3a test: added minimal tests for svg and png formatters 2024-09-21 21:08:03 -03:00
8b34a1659d fix: Bug in high-level API for png formatter 2024-09-21 21:07:44 -03:00
3bf8172b89 fix: Terminal formatter was skipping things that it could highlight 2024-09-21 20:57:24 -03:00
4432da2893 bump: Release v0.9.0 2024-09-21 20:33:24 -03:00
40 changed files with 1521 additions and 249 deletions


@@ -9,7 +9,7 @@ permissions:
   contents: read
 jobs:
   build:
-    runs-on: ubuntu-latest
+    runs-on: ubuntu-22.04
     steps:
       - name: Download source
         uses: actions/checkout@v4


@@ -7,7 +7,7 @@ permissions:
   contents: read
 jobs:
   build:
-    runs-on: ubuntu-latest
+    runs-on: ubuntu-22.04
     steps:
       - name: Download source
         uses: actions/checkout@v4
@@ -15,7 +15,7 @@ jobs:
         uses: crystal-lang/install-crystal@v1
       - name: Run tests using kcov
         run: |
-          sudo apt update && sudo apt install kcov
+          sudo apt update && sudo apt upgrade && sudo apt install -y kcov
           wget https://github.com/alecthomas/chroma/releases/download/v2.14.0/chroma-2.14.0-linux-amd64.tar.gz
           tar xzvf chroma-2.14.0*gz
           mkdir ~/.local/bin -p

.gitignore

@@ -12,3 +12,6 @@ venv/
 .croupier
 coverage/
 run_tests
+
+# We use the internal crystal lexer
+lexers/crystal.xml


@@ -2,6 +2,67 @@
 All notable changes to this project will be documented in this file.
 
+## [0.12.0] - 2025-01-21
+
+### 🚀 Features
+
+- Bumped to latest chroma release
+
+### ⚙️ Miscellaneous Tasks
+
+- Pin ubuntu version in CI
+- Mark more mcfunction tests as bad
+
+### Build
+
+- Automate AUR release
+
+## [0.11.1] - 2024-10-14
+
+### 🐛 Bug Fixes
+
+- Support choosing lexers when used as a library
+
+## [0.11.0] - 2024-10-14
+
+### 🚀 Features
+
+- Support selecting only some themes
+
+## [0.10.0] - 2024-09-26
+
+### 🚀 Features
+
+- Optional conditional baking of lexers
+
+### 🐛 Bug Fixes
+
+- Strip binaries for release artifacts
+- Fix metadata to show crystal
+
+## [0.9.1] - 2024-09-22
+
+### 🐛 Bug Fixes
+
+- Terminal formatter was skipping things that it could highlight
+- Bug in high-level API for png formatter
+
+### 🧪 Testing
+
+- Added minimal tests for svg and png formatters
+
+## [0.9.0] - 2024-09-21
+
+### 🚀 Features
+
+- PNG writer based on Stumpy libs
+
+### ⚙️ Miscellaneous Tasks
+
+- Clean
+- Detect version bump in release script
+- Improve changelog handling
+
 ## [0.8.0] - 2024-09-21
 ### 🚀 Features


@@ -121,3 +121,17 @@ tasks:
       - src
     commands: |
       tokei src -e src/constants/
+
+  aur:
+    phony: true
+    always_run: true
+    commands: |
+      rm -rf aur-{{NAME}}
+      git clone ssh://aur@aur.archlinux.org/{{NAME}}.git aur-{{NAME}}
+      sed s/pkgver=.*/pkgver=$(shards version)/ -i aur-{{NAME}}/PKGBUILD
+      sed s/pkgrel=.*/pkgrel=1/ -i aur-{{NAME}}/PKGBUILD
+      cd aur-{{NAME}} && updpkgsums && makepkg --printsrcinfo > .SRCINFO
+      cd aur-{{NAME}} && makepkg -fsr
+      cd aur-{{NAME}} && git add PKGBUILD .SRCINFO
+      cd aur-{{NAME}} && git commit -a -m "Update to $(shards version)"
+      cd aur-{{NAME}} && git push
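With this task in place, publishing the AUR package is one more hace invocation, in the same style as the `hace lint test` and `hace static` calls already used by the release script further down:

```bash
# Run after the release has been tagged and pushed;
# assumes the AUR SSH key and makepkg toolchain are already set up locally.
hace aur
```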


@@ -82,6 +82,78 @@ puts formatter.format("puts \"Hello, world!\"", lexer)
 The reason you may want to use the manual version is to reuse
 the lexer and formatter objects for performance reasons.
 
+## Choosing what Lexers you want
+
+By default Tartrazine will support all its lexers by embedding
+them in the binary. This makes the binary large. If you are
+using it as a library, you may want to just include a selection of lexers. To do that:
+
+* Pass the `-Dnolexers` flag to the compiler
+* Set the `TT_LEXERS` environment variable to a
+  comma-separated list of lexers you want to include.
+
+This builds a binary with only the python, markdown, bash and yaml lexers (enough to highlight this `README.md`):
+
+```bash
+> TT_LEXERS=python,markdown,bash,yaml shards build -Dnolexers -d --error-trace
+Dependencies are satisfied
+Building: tartrazine
+```
+
+## Choosing what themes you want
+
+Themes come from two places, tartrazine itself and [Sixteen](https://github.com/ralsina/sixteen).
+
+To only embed selected themes, build your project with the `-Dnothemes` option, and
+you can set two environment variables to control which themes are included:
+
+* `TT_THEMES` is a comma-separated list of themes to include from tartrazine (see the styles directory in the source)
+* `SIXTEEN_THEMES` is a comma-separated list of themes to include from Sixteen (see the base16 directory in the sixteen source)
+
+For example (using the tartrazine CLI as the project):
+
+```bash
+$ TT_THEMES=colorful,autumn SIXTEEN_THEMES=pasque,pico shards build -Dnothemes
+Dependencies are satisfied
+Building: tartrazine
+$ ./bin/tartrazine --list-themes
+autumn
+colorful
+pasque
+pico
+```
+
+Be careful not to build without any themes at all; if you do, nothing will work.
+
+## Templates for standalone HTML output
+
+If you are using the HTML formatter, you can pass a template to use for the output. The template is a string where the following placeholders will be replaced:
+
+* `{{style_defs}}` will be replaced by the CSS styles needed for the theme
+* `{{code}}` will be replaced by the highlighted code
+
+This is an example template that changes the padding around the code:
+
+```jinja2
+<!DOCTYPE html>
+<html>
+<head>
+  <style>
+    {{style_defs}}
+    pre {
+      padding: 1em;
+    }
+  </style>
+</head>
+<body>
+{{code}}
+</body>
+</html>
+```
+
 ## Contributing
 1. Fork it (<https://github.com/ralsina/tartrazine/fork>)
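For context, here is a rough sketch of how such a template might be wired up from Crystal code, building on the `formatter.format(...)` example shown in this README diff. The `template:` argument name is an assumption for illustration only — this diff documents the placeholders, not the exact API — and the theme name is just an example:

```crystal
require "tartrazine"

# Hypothetical wiring; `template:` is an assumed option name, check the
# HTML formatter's actual API before relying on it.
my_template = <<-TEMPLATE
  <!DOCTYPE html>
  <html>
  <head><style>{{style_defs}} pre { padding: 2em; }</style></head>
  <body>{{code}}</body>
  </html>
  TEMPLATE

lexer = Tartrazine.lexer("crystal")
formatter = Tartrazine::Html.new(
  theme: Tartrazine.theme("catppuccin-macchiato"), # any baked-in theme
  standalone: true,
  template: my_template, # assumed keyword argument
)
puts formatter.format("puts \"Hello, world!\"", lexer)
```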


@@ -7,10 +7,10 @@ docker run --rm --privileged \
 # Build for AMD64
 docker build . -f Dockerfile.static -t tartrazine-builder
-docker run -ti --rm -v "$PWD":/app --user="$UID" tartrazine-builder /bin/sh -c "cd /app && rm -rf lib shard.lock && shards build --static --release"
+docker run -ti --rm -v "$PWD":/app --user="$UID" tartrazine-builder /bin/sh -c "cd /app && rm -rf lib shard.lock && shards build --static --release && strip bin/tartrazine"
 mv bin/tartrazine bin/tartrazine-static-linux-amd64
 
 # Build for ARM64
 docker build . -f Dockerfile.static --platform linux/arm64 -t tartrazine-builder
-docker run -ti --rm -v "$PWD":/app --platform linux/arm64 --user="$UID" tartrazine-builder /bin/sh -c "cd /app && rm -rf lib shard.lock && shards build --static --release"
+docker run -ti --rm -v "$PWD":/app --platform linux/arm64 --user="$UID" tartrazine-builder /bin/sh -c "cd /app && rm -rf lib shard.lock && shards build --static --release && strip bin/tartrazine"
 mv bin/tartrazine bin/tartrazine-static-linux-arm64


@@ -9,7 +9,7 @@ git add shard.yml
 hace lint test
 git cliff --bump -u -p CHANGELOG.md
 git commit -a -m "bump: Release v$VERSION"
+hace static
 git tag "v$VERSION"
 git push --tags
-hace static
 gh release create "v$VERSION" "bin/$PKGNAME-static-linux-amd64" "bin/$PKGNAME-static-linux-arm64" --title "Release v$VERSION" --notes "$(git cliff -l -s all)"

lexers/atl.xml (new file)

@@ -0,0 +1,165 @@
<lexer>
<config>
<name>ATL</name>
<alias>atl</alias>
<filename>*.atl</filename>
<mime_type>text/x-atl</mime_type>
<dot_all>true</dot_all>
</config>
<rules>
<state name="root">
<rule pattern="(--.*?)(\n)">
<bygroups>
<token type="CommentSingle" />
<token type="TextWhitespace" />
</bygroups>
</rule>
<rule pattern="(and|distinct|endif|else|for|foreach|if|implies|in|let|not|or|self|super|then|thisModule|xor)\b">
<token type="Keyword" />
</rule>
<rule pattern="(OclUndefined|true|false|#\w+)\b">
<token type="KeywordConstant" />
</rule>
<rule pattern="(module|query|library|create|from|to|uses)\b">
<token type="KeywordNamespace" />
</rule>
<rule pattern="(do)(\s*)({)">
<bygroups>
<token type="KeywordNamespace" />
<token type="TextWhitespace" />
<token type="Punctuation" />
</bygroups>
</rule>
<rule pattern="(abstract|endpoint|entrypoint|lazy|unique)(\s+)">
<bygroups>
<token type="KeywordDeclaration" />
<token type="TextWhitespace" />
</bygroups>
</rule>
<rule pattern="(rule)(\s+)">
<bygroups>
<token type="KeywordNamespace" />
<token type="TextWhitespace" />
</bygroups>
</rule>
<rule pattern="(helper)(\s+)">
<bygroups>
<token type="KeywordNamespace" />
<token type="TextWhitespace" />
</bygroups>
</rule>
<rule pattern="(context)(\s+)">
<bygroups>
<token type="KeywordNamespace" />
<token type="TextWhitespace" />
</bygroups>
</rule>
<rule pattern="(def)(\s*)(:)(\s*)">
<bygroups>
<token type="KeywordNamespace" />
<token type="TextWhitespace" />
<token type="Punctuation" />
<token type="TextWhitespace" />
</bygroups>
</rule>
<rule pattern="(Bag|Boolean|Integer|OrderedSet|Real|Sequence|Set|String|Tuple)">
<token type="KeywordType" />
</rule>
<rule pattern="(\w+)(\s*)(&lt;-|&lt;:=)">
<bygroups>
<token type="NameNamespace" />
<token type="TextWhitespace" />
<token type="Punctuation" />
</bygroups>
</rule>
<rule pattern="#&quot;">
<token type="KeywordConstant" />
<push state="quotedenumliteral" />
</rule>
<rule pattern="&quot;">
<token type="NameNamespace" />
<push state="quotedname" />
</rule>
<rule pattern="[^\S\n]+">
<token type="TextWhitespace" />
</rule>
<rule pattern="&#x27;">
<token type="LiteralString" />
<push state="string" />
</rule>
<rule pattern="[0-9]*\.[0-9]+">
<token type="LiteralNumberFloat" />
</rule>
<rule pattern="0|[1-9][0-9]*">
<token type="LiteralNumberInteger" />
</rule>
<rule pattern="[*&lt;&gt;+=/-]">
<token type="Operator" />
</rule>
<rule pattern="([{}();:.,!|]|-&gt;)">
<token type="Punctuation" />
</rule>
<rule pattern="\n">
<token type="TextWhitespace" />
</rule>
<rule pattern="\w+">
<token type="NameNamespace" />
</rule>
</state>
<state name="string">
<rule pattern="[^\\&#x27;]+">
<token type="LiteralString" />
</rule>
<rule pattern="\\\\">
<token type="LiteralString" />
</rule>
<rule pattern="\\&#x27;">
<token type="LiteralString" />
</rule>
<rule pattern="\\">
<token type="LiteralString" />
</rule>
<rule pattern="&#x27;">
<token type="LiteralString" />
<pop depth="1" />
</rule>
</state>
<state name="quotedname">
<rule pattern="[^\\&quot;]+">
<token type="NameNamespace" />
</rule>
<rule pattern="\\\\">
<token type="NameNamespace" />
</rule>
<rule pattern="\\&quot;">
<token type="NameNamespace" />
</rule>
<rule pattern="\\">
<token type="NameNamespace" />
</rule>
<rule pattern="&quot;">
<token type="NameNamespace" />
<pop depth="1" />
</rule>
</state>
<state name="quotedenumliteral">
<rule pattern="[^\\&quot;]+">
<token type="KeywordConstant" />
</rule>
<rule pattern="\\\\">
<token type="KeywordConstant" />
</rule>
<rule pattern="\\&quot;">
<token type="KeywordConstant" />
</rule>
<rule pattern="\\">
<token type="KeywordConstant" />
</rule>
<rule pattern="&quot;">
<token type="KeywordConstant" />
<pop depth="1" />
</rule>
</state>
</rules>
</lexer>

lexers/beef.xml (new file)

@@ -0,0 +1,120 @@
<lexer>
<config>
<name>Beef</name>
<alias>beef</alias>
<filename>*.bf</filename>
<mime_type>text/x-beef</mime_type>
<dot_all>true</dot_all>
<ensure_nl>true</ensure_nl>
</config>
<rules>
<state name="root">
<rule pattern="^\s*\[.*?\]">
<token type="NameAttribute"/>
</rule>
<rule pattern="[^\S\n]+">
<token type="Text"/>
</rule>
<rule pattern="\\\n">
<token type="Text"/>
</rule>
<rule pattern="///[^\n\r]*">
<token type="CommentSpecial"/>
</rule>
<rule pattern="//[^\n\r]*">
<token type="CommentSingle"/>
</rule>
<rule pattern="/[*].*?[*]/">
<token type="CommentMultiline"/>
</rule>
<rule pattern="\n">
<token type="Text"/>
</rule>
<rule pattern="[~!%^&amp;*()+=|\[\]:;,.&lt;&gt;/?-]">
<token type="Punctuation"/>
</rule>
<rule pattern="[{}]">
<token type="Punctuation"/>
</rule>
<rule pattern="@&#34;(&#34;&#34;|[^&#34;])*&#34;">
<token type="LiteralString"/>
</rule>
<rule pattern="\$@?&#34;(&#34;&#34;|[^&#34;])*&#34;">
<token type="LiteralString"/>
</rule>
<rule pattern="&#34;(\\\\|\\&#34;|[^&#34;\n])*[&#34;\n]">
<token type="LiteralString"/>
</rule>
<rule pattern="&#39;\\.&#39;|&#39;[^\\]&#39;">
<token type="LiteralStringChar"/>
</rule>
<rule pattern="0[xX][0-9a-fA-F]+[Ll]?|\d[_\d]*(\.\d*)?([eE][+-]?\d+)?[flFLdD]?">
<token type="LiteralNumber"/>
</rule>
<rule pattern="#[ \t]*(if|endif|else|elif|define|undef|line|error|warning|region|endregion|pragma|nullable)\b">
<token type="CommentPreproc"/>
</rule>
<rule pattern="\b(extern)(\s+)(alias)\b">
<bygroups>
<token type="Keyword"/>
<token type="Text"/>
<token type="Keyword"/>
</bygroups>
</rule>
<rule pattern="(as|await|base|break|by|case|catch|checked|continue|default|delegate|else|event|finally|fixed|for|repeat|goto|if|in|init|is|let|lock|new|scope|on|out|params|readonly|ref|return|sizeof|stackalloc|switch|this|throw|try|typeof|unchecked|virtual|void|while|get|set|new|yield|add|remove|value|alias|ascending|descending|from|group|into|orderby|select|thenby|where|join|equals)\b">
<token type="Keyword"/>
</rule>
<rule pattern="(global)(::)">
<bygroups>
<token type="Keyword"/>
<token type="Punctuation"/>
</bygroups>
</rule>
<rule pattern="(abstract|async|const|enum|explicit|extern|implicit|internal|operator|override|partial|extension|private|protected|public|static|sealed|unsafe|volatile)\b">
<token type="KeywordDeclaration"/>
</rule>
<rule pattern="(bool|byte|char8|char16|char32|decimal|double|float|int|int8|int16|int32|int64|long|object|sbyte|short|string|uint|uint8|uint16|uint32|uint64|uint|let|var)\b\??">
<token type="KeywordType"/>
</rule>
<rule pattern="(true|false|null)\b">
<token type="KeywordConstant"/>
</rule>
<rule pattern="(class|struct|record|interface)(\s+)">
<bygroups>
<token type="Keyword"/>
<token type="Text"/>
</bygroups>
<push state="class"/>
</rule>
<rule pattern="(namespace|using)(\s+)">
<bygroups>
<token type="Keyword"/>
<token type="Text"/>
</bygroups>
<push state="namespace"/>
</rule>
<rule pattern="@?[_a-zA-Z]\w*">
<token type="Name"/>
</rule>
</state>
<state name="class">
<rule pattern="@?[_a-zA-Z]\w*">
<token type="NameClass"/>
<pop depth="1"/>
</rule>
<rule>
<pop depth="1"/>
</rule>
</state>
<state name="namespace">
<rule pattern="(?=\()">
<token type="Text"/>
<pop depth="1"/>
</rule>
<rule pattern="(@?[_a-zA-Z]\w*|\.)+">
<token type="NameNamespace"/>
<pop depth="1"/>
</rule>
</state>
</rules>
</lexer>

lexers/csv.xml (new file)

@@ -0,0 +1,53 @@
<!--
Lexer for RFC-4180 compliant CSV subject to the following additions:
- UTF-8 encoding is accepted (the RFC requires 7-bit ASCII)
- The line terminator character can be LF or CRLF (the RFC allows CRLF only)
Link to the RFC-4180 specification: https://tools.ietf.org/html/rfc4180
Additions inspired by:
https://github.com/frictionlessdata/datapackage/issues/204#issuecomment-193242077
Future improvements:
- Identify non-quoted numbers as LiteralNumber
- Identify y as an error in "x"y. Currently it's identified as another string
literal.
-->
<lexer>
<config>
<name>CSV</name>
<alias>csv</alias>
<filename>*.csv</filename>
<mime_type>text/csv</mime_type>
</config>
<rules>
<state name="root">
<rule pattern="\r?\n">
<token type="Punctuation" />
</rule>
<rule pattern=",">
<token type="Punctuation" />
</rule>
<rule pattern="&quot;">
<token type="LiteralStringDouble" />
<push state="escaped" />
</rule>
<rule pattern="[^\r\n,]+">
<token type="LiteralString" />
</rule>
</state>
<state name="escaped">
<rule pattern="&quot;&quot;">
<token type="LiteralStringEscape"/>
</rule>
<rule pattern="&quot;">
<token type="LiteralStringDouble" />
<pop depth="1"/>
</rule>
<rule pattern="[^&quot;]+">
<token type="LiteralStringDouble" />
</rule>
</state>
</rules>
</lexer>


@@ -3,7 +3,6 @@
 <name>Groff</name>
 <alias>groff</alias>
 <alias>nroff</alias>
-<alias>roff</alias>
 <alias>man</alias>
 <filename>*.[1-9]</filename>
 <filename>*.1p</filename>


@@ -95,19 +95,22 @@
 <rule pattern="[:!#$%&amp;*+.\\/&lt;=&gt;?@^|~-]+">
 <token type="Operator"/>
 </rule>
-<rule pattern="\d+[eE][+-]?\d+">
+<rule pattern="\d+_*[eE][+-]?\d+">
 <token type="LiteralNumberFloat"/>
 </rule>
-<rule pattern="\d+\.\d+([eE][+-]?\d+)?">
+<rule pattern="\d+(_+[\d]+)*\.\d+(_+[\d]+)*([eE][+-]?\d+)?">
 <token type="LiteralNumberFloat"/>
 </rule>
-<rule pattern="0[oO][0-7]+">
+<rule pattern="0[oO](_*[0-7])+">
 <token type="LiteralNumberOct"/>
 </rule>
-<rule pattern="0[xX][\da-fA-F]+">
+<rule pattern="0[xX](_*[\da-fA-F])+">
 <token type="LiteralNumberHex"/>
 </rule>
-<rule pattern="\d+">
+<rule pattern="0[bB](_*[01])+">
+<token type="LiteralNumberBin"/>
+</rule>
+<rule pattern="\d+(_*[\d])*">
 <token type="LiteralNumberInteger"/>
 </rule>
 <rule pattern="&#39;">


@@ -3,6 +3,7 @@
 <name>JSON</name>
 <alias>json</alias>
 <filename>*.json</filename>
+<filename>*.jsonc</filename>
 <filename>*.avsc</filename>
 <mime_type>application/json</mime_type>
 <dot_all>true</dot_all>

lexers/jsonnet.xml (new file)

@@ -0,0 +1,137 @@
<lexer>
<config>
<name>Jsonnet</name>
<alias>jsonnet</alias>
<filename>*.jsonnet</filename>
<filename>*.libsonnet</filename>
</config>
<rules>
<state name="_comments">
<rule pattern="(//|#).*\n"><token type="CommentSingle"/></rule>
<rule pattern="/\*\*([^/]|/(?!\*))*\*/"><token type="LiteralStringDoc"/></rule>
<rule pattern="/\*([^/]|/(?!\*))*\*/"><token type="Comment"/></rule>
</state>
<state name="root">
<rule><include state="_comments"/></rule>
<rule pattern="@&#x27;.*&#x27;"><token type="LiteralString"/></rule>
<rule pattern="@&quot;.*&quot;"><token type="LiteralString"/></rule>
<rule pattern="&#x27;"><token type="LiteralString"/><push state="singlestring"/></rule>
<rule pattern="&quot;"><token type="LiteralString"/><push state="doublestring"/></rule>
<rule pattern="\|\|\|(.|\n)*\|\|\|"><token type="LiteralString"/></rule>
<rule pattern="[+-]?[0-9]+(.[0-9])?"><token type="LiteralNumberFloat"/></rule>
<rule pattern="[!$~+\-&amp;|^=&lt;&gt;*/%]"><token type="Operator"/></rule>
<rule pattern="\{"><token type="Punctuation"/><push state="object"/></rule>
<rule pattern="\["><token type="Punctuation"/><push state="array"/></rule>
<rule pattern="local\b"><token type="Keyword"/><push state="local_name"/></rule>
<rule pattern="assert\b"><token type="Keyword"/><push state="assert"/></rule>
<rule pattern="(assert|else|error|false|for|if|import|importstr|in|null|tailstrict|then|self|super|true)\b"><token type="Keyword"/></rule>
<rule pattern="\s+"><token type="TextWhitespace"/></rule>
<rule pattern="function(?=\()"><token type="Keyword"/><push state="function_params"/></rule>
<rule pattern="std\.[^\W\d]\w*(?=\()"><token type="NameBuiltin"/><push state="function_args"/></rule>
<rule pattern="[^\W\d]\w*(?=\()"><token type="NameFunction"/><push state="function_args"/></rule>
<rule pattern="[^\W\d]\w*"><token type="NameVariable"/></rule>
<rule pattern="[\.()]"><token type="Punctuation"/></rule>
</state>
<state name="singlestring">
<rule pattern="[^&#x27;\\]"><token type="LiteralString"/></rule>
<rule pattern="\\."><token type="LiteralStringEscape"/></rule>
<rule pattern="&#x27;"><token type="LiteralString"/><pop depth="1"/></rule>
</state>
<state name="doublestring">
<rule pattern="[^&quot;\\]"><token type="LiteralString"/></rule>
<rule pattern="\\."><token type="LiteralStringEscape"/></rule>
<rule pattern="&quot;"><token type="LiteralString"/><pop depth="1"/></rule>
</state>
<state name="array">
<rule pattern=","><token type="Punctuation"/></rule>
<rule pattern="\]"><token type="Punctuation"/><pop depth="1"/></rule>
<rule><include state="root"/></rule>
</state>
<state name="local_name">
<rule pattern="[^\W\d]\w*(?=\()"><token type="NameFunction"/><push state="function_params"/></rule>
<rule pattern="[^\W\d]\w*"><token type="NameVariable"/></rule>
<rule pattern="\s+"><token type="TextWhitespace"/></rule>
<rule pattern="(?==)"><token type="TextWhitespace"/><push state="#pop" state="local_value"/></rule>
</state>
<state name="local_value">
<rule pattern="="><token type="Operator"/></rule>
<rule pattern=";"><token type="Punctuation"/><pop depth="1"/></rule>
<rule><include state="root"/></rule>
</state>
<state name="assert">
<rule pattern=":"><token type="Punctuation"/></rule>
<rule pattern=";"><token type="Punctuation"/><pop depth="1"/></rule>
<rule><include state="root"/></rule>
</state>
<state name="function_params">
<rule pattern="[^\W\d]\w*"><token type="NameVariable"/></rule>
<rule pattern="\("><token type="Punctuation"/></rule>
<rule pattern="\)"><token type="Punctuation"/><pop depth="1"/></rule>
<rule pattern=","><token type="Punctuation"/></rule>
<rule pattern="\s+"><token type="TextWhitespace"/></rule>
<rule pattern="="><token type="Operator"/><push state="function_param_default"/></rule>
</state>
<state name="function_args">
<rule pattern="\("><token type="Punctuation"/></rule>
<rule pattern="\)"><token type="Punctuation"/><pop depth="1"/></rule>
<rule pattern=","><token type="Punctuation"/></rule>
<rule pattern="\s+"><token type="TextWhitespace"/></rule>
<rule><include state="root"/></rule>
</state>
<state name="object">
<rule pattern="\s+"><token type="TextWhitespace"/></rule>
<rule pattern="local\b"><token type="Keyword"/><push state="object_local_name"/></rule>
<rule pattern="assert\b"><token type="Keyword"/><push state="object_assert"/></rule>
<rule pattern="\["><token type="Operator"/><push state="field_name_expr"/></rule>
<rule pattern="(?=[^\W\d]\w*)"><token type="Text"/><push state="field_name"/></rule>
<rule pattern="\}"><token type="Punctuation"/><pop depth="1"/></rule>
<rule pattern="&quot;"><token type="NameVariable"/><push state="double_field_name"/></rule>
<rule pattern="&#x27;"><token type="NameVariable"/><push state="single_field_name"/></rule>
<rule><include state="_comments"/></rule>
</state>
<state name="field_name">
<rule pattern="[^\W\d]\w*(?=\()"><token type="NameFunction"/><push state="field_separator" state="function_params"/></rule>
<rule pattern="[^\W\d]\w*"><token type="NameVariable"/><push state="field_separator"/></rule>
</state>
<state name="double_field_name">
<rule pattern="([^&quot;\\]|\\.)*&quot;"><token type="NameVariable"/><push state="field_separator"/></rule>
</state>
<state name="single_field_name">
<rule pattern="([^&#x27;\\]|\\.)*&#x27;"><token type="NameVariable"/><push state="field_separator"/></rule>
</state>
<state name="field_name_expr">
<rule pattern="\]"><token type="Operator"/><push state="field_separator"/></rule>
<rule><include state="root"/></rule>
</state>
<state name="function_param_default">
<rule pattern="(?=[,\)])"><token type="TextWhitespace"/><pop depth="1"/></rule>
<rule><include state="root"/></rule>
</state>
<state name="field_separator">
<rule pattern="\s+"><token type="TextWhitespace"/></rule>
<rule pattern="\+?::?:?"><token type="Punctuation"/><push state="#pop" state="#pop" state="field_value"/></rule>
<rule><include state="_comments"/></rule>
</state>
<state name="field_value">
<rule pattern=","><token type="Punctuation"/><pop depth="1"/></rule>
<rule pattern="\}"><token type="Punctuation"/><pop depth="2"/></rule>
<rule><include state="root"/></rule>
</state>
<state name="object_assert">
<rule pattern=":"><token type="Punctuation"/></rule>
<rule pattern=","><token type="Punctuation"/><pop depth="1"/></rule>
<rule><include state="root"/></rule>
</state>
<state name="object_local_name">
<rule pattern="[^\W\d]\w*"><token type="NameVariable"/><push state="#pop" state="object_local_value"/></rule>
<rule pattern="\s+"><token type="TextWhitespace"/></rule>
</state>
<state name="object_local_value">
<rule pattern="="><token type="Operator"/></rule>
<rule pattern=","><token type="Punctuation"/><pop depth="1"/></rule>
<rule pattern="\}"><token type="Punctuation"/><pop depth="2"/></rule>
<rule><include state="root"/></rule>
</state>
</rules>
</lexer>


@ -45,7 +45,7 @@
</emitters> </emitters>
</usingbygroup> </usingbygroup>
</rule> </rule>
<rule pattern="(ACCESS|ADD|ADDRESSES|AGGREGATE|ALIGNED|ALL|ALTER|ANALYSIS|AND|ANY|ARITY|ARN|ARRANGEMENT|ARRAY|AS|ASC|ASSERT|ASSUME|AT|AUCTION|AUTHORITY|AVAILABILITY|AVRO|AWS|BATCH|BEGIN|BETWEEN|BIGINT|BILLED|BODY|BOOLEAN|BOTH|BPCHAR|BROKEN|BROKER|BROKERS|BY|BYTES|CARDINALITY|CASCADE|CASE|CAST|CERTIFICATE|CHAIN|CHAINS|CHAR|CHARACTER|CHARACTERISTICS|CHECK|CLIENT|CLOSE|CLUSTER|CLUSTERS|COALESCE|COLLATE|COLUMN|COLUMNS|COMMENT|COMMIT|COMMITTED|COMPACTION|COMPATIBILITY|COMPRESSION|COMPUTE|COMPUTECTL|CONFIG|CONFLUENT|CONNECTION|CONNECTIONS|CONSTRAINT|COPY|COUNT|COUNTER|CREATE|CREATECLUSTER|CREATEDB|CREATEROLE|CREATION|CROSS|CSV|CURRENT|CURSOR|DATABASE|DATABASES|DATUMS|DAY|DAYS|DEALLOCATE|DEBEZIUM|DEBUG|DEBUGGING|DEC|DECIMAL|DECLARE|DECODING|DECORRELATED|DEFAULT|DEFAULTS|DELETE|DELIMITED|DELIMITER|DELTA|DESC|DETAILS|DISCARD|DISK|DISTINCT|DOC|DOT|DOUBLE|DROP|EAGER|ELEMENT|ELSE|ENABLE|END|ENDPOINT|ENFORCED|ENVELOPE|ERROR|ERRORS|ESCAPE|ESTIMATE|EVERY|EXCEPT|EXECUTE|EXISTS|EXPECTED|EXPLAIN|EXPOSE|EXPRESSIONS|EXTERNAL|EXTRACT|FACTOR|FALSE|FAST|FEATURES|FETCH|FIELDS|FILE|FILTER|FIRST|FIXPOINT|FLOAT|FOLLOWING|FOR|FOREIGN|FORMAT|FORWARD|FROM|FULL|FULLNAME|FUNCTION|GENERATOR|GRANT|GREATEST|GROUP|GROUPS|HAVING|HEADER|HEADERS|HISTORY|HOLD|HOST|HOUR|HOURS|HUMANIZED|ID|IDENTIFIERS|IDS|IF|IGNORE|ILIKE|IMPLEMENTATIONS|IMPORTED|IN|INCLUDE|INDEX|INDEXES|INFO|INHERIT|INLINE|INNER|INPUT|INSERT|INSIGHTS|INSPECT|INT|INTEGER|INTERNAL|INTERSECT|INTERVAL|INTO|INTROSPECTION|IS|ISNULL|ISOLATION|JOIN|JOINS|JSON|KAFKA|KEY|KEYS|LAST|LATERAL|LATEST|LEADING|LEAST|LEFT|LEGACY|LETREC|LEVEL|LIKE|LIMIT|LINEAR|LIST|LOAD|LOCAL|LOCALLY|LOG|LOGICAL|LOGIN|LOWERING|MANAGED|MANUAL|MAP|MARKETING|MATERIALIZE|MATERIALIZED|MAX|MECHANISMS|MEMBERSHIP|MESSAGE|METADATA|MINUTE|MINUTES|MODE|MONTH|MONTHS|MUTUALLY|MYSQL|NAME|NAMES|NATURAL|NEGATIVE|NEW|NEXT|NO|NOCREATECLUSTER|NOCREATEDB|NOCREATEROLE|NODE|NOINHERIT|NOLOGIN|NON|NONE|NOSUPERUSER|NOT|NOTICE|NOTICES|NULL|NULLIF|NULLS|OBJECTS|OF|OFFSET|ON|ONLY|OPERATOR|OPTIMIZED|OPTIMIZER|OPTIONS|OR|ORDER|ORDINALITY|OUTER|OVER|OWNED|OWNER|PARTITION|PARTITIONS|PASSWORD|PATH|PHYSICAL|PLAN|PLANS|PORT|POSITION|POSTGRES|PRECEDING|PRECISION|PREFIX|PREPARE|PRIMARY|PRIVATELINK|PRIVILEGES|PROGRESS|PROTOBUF|PROTOCOL|PUBLICATION|PUSHDOWN|QUERY|QUOTE|RAISE|RANGE|RATE|RAW|READ|REAL|REASSIGN|RECURSION|RECURSIVE|REDACTED|REFERENCE|REFERENCES|REFRESH|REGEX|REGION|REGISTRY|REHYDRATION|RENAME|REOPTIMIZE|REPEATABLE|REPLACE|REPLAN|REPLICA|REPLICAS|REPLICATION|RESET|RESPECT|RESTRICT|RETAIN|RETURN|RETURNING|REVOKE|RIGHT|ROLE|ROLES|ROLLBACK|ROTATE|ROUNDS|ROW|ROWS|SASL|SCALE|SCHEDULE|SCHEMA|SCHEMAS|SECOND|SECONDS|SECRET|SECRETS|SECURITY|SEED|SELECT|SEQUENCES|SERIALIZABLE|SERVICE|SESSION|SET|SHARD|SHOW|SINK|SINKS|SIZE|SMALLINT|SNAPSHOT|SOME|SOURCE|SOURCES|SSH|SSL|START|STDIN|STDOUT|STORAGE|STORAGECTL|STRATEGY|STRICT|STRING|STRONG|SUBSCRIBE|SUBSOURCE|SUBSOURCES|SUBSTRING|SUBTREE|SUPERUSER|SWAP|SYNTAX|SYSTEM|TABLE|TABLES|TAIL|TEMP|TEMPORARY|TEXT|THEN|TICK|TIES|TIME|TIMELINE|TIMEOUT|TIMESTAMP|TIMESTAMPTZ|TIMING|TO|TOKEN|TOPIC|TPCH|TRACE|TRAILING|TRANSACTION|TRANSACTIONAL|TRIM|TRUE|TUNNEL|TYPE|TYPES|UNBOUNDED|UNCOMMITTED|UNION|UNIQUE|UNKNOWN|UP|UPDATE|UPSERT|URL|USAGE|USER|USERNAME|USERS|USING|VALIDATE|VALUE|VALUES|VARCHAR|VARIADIC|VARYING|VERSION|VIEW|VIEWS|WARNING|WEBHOOK|WHEN|WHERE|WINDOW|WIRE|WITH|WITHIN|WITHOUT|WORK|WORKERS|WRITE|YEAR|YEARS|ZONE|ZONES)\b"> <rule 
pattern="(ACCESS|ADD|ADDRESSES|AGGREGATE|ALIGNED|ALL|ALTER|ANALYSIS|AND|ANY|ARITY|ARN|ARRANGEMENT|ARRAY|AS|ASC|ASSERT|ASSUME|AT|AUCTION|AUTHORITY|AVAILABILITY|AVRO|AWS|BATCH|BEGIN|BETWEEN|BIGINT|BILLED|BODY|BOOLEAN|BOTH|BPCHAR|BROKEN|BROKER|BROKERS|BY|BYTES|CARDINALITY|CASCADE|CASE|CAST|CERTIFICATE|CHAIN|CHAINS|CHAR|CHARACTER|CHARACTERISTICS|CHECK|CLASS|CLIENT|CLOCK|CLOSE|CLUSTER|CLUSTERS|COALESCE|COLLATE|COLUMN|COLUMNS|COMMENT|COMMIT|COMMITTED|COMPACTION|COMPATIBILITY|COMPRESSION|COMPUTE|COMPUTECTL|CONFIG|CONFLUENT|CONNECTION|CONNECTIONS|CONSTRAINT|CONTINUAL|COPY|COUNT|COUNTER|CREATE|CREATECLUSTER|CREATEDB|CREATEROLE|CREATION|CROSS|CSV|CURRENT|CURSOR|DATABASE|DATABASES|DATUMS|DAY|DAYS|DEALLOCATE|DEBEZIUM|DEBUG|DEBUGGING|DEC|DECIMAL|DECLARE|DECODING|DECORRELATED|DEFAULT|DEFAULTS|DELETE|DELIMITED|DELIMITER|DELTA|DESC|DETAILS|DISCARD|DISK|DISTINCT|DOC|DOT|DOUBLE|DROP|EAGER|ELEMENT|ELSE|ENABLE|END|ENDPOINT|ENFORCED|ENVELOPE|ERROR|ERRORS|ESCAPE|ESTIMATE|EVERY|EXCEPT|EXCLUDE|EXECUTE|EXISTS|EXPECTED|EXPLAIN|EXPOSE|EXPRESSIONS|EXTERNAL|EXTRACT|FACTOR|FALSE|FAST|FEATURES|FETCH|FIELDS|FILE|FILTER|FIRST|FIXPOINT|FLOAT|FOLLOWING|FOR|FOREIGN|FORMAT|FORWARD|FROM|FULL|FULLNAME|FUNCTION|FUSION|GENERATOR|GRANT|GREATEST|GROUP|GROUPS|HAVING|HEADER|HEADERS|HISTORY|HOLD|HOST|HOUR|HOURS|HUMANIZED|HYDRATION|ID|IDENTIFIERS|IDS|IF|IGNORE|ILIKE|IMPLEMENTATIONS|IMPORTED|IN|INCLUDE|INDEX|INDEXES|INFO|INHERIT|INLINE|INNER|INPUT|INSERT|INSIGHTS|INSPECT|INT|INTEGER|INTERNAL|INTERSECT|INTERVAL|INTO|INTROSPECTION|IS|ISNULL|ISOLATION|JOIN|JOINS|JSON|KAFKA|KEY|KEYS|LAST|LATERAL|LATEST|LEADING|LEAST|LEFT|LEGACY|LETREC|LEVEL|LIKE|LIMIT|LINEAR|LIST|LOAD|LOCAL|LOCALLY|LOG|LOGICAL|LOGIN|LOWERING|MANAGED|MANUAL|MAP|MARKETING|MATERIALIZE|MATERIALIZED|MAX|MECHANISMS|MEMBERSHIP|MESSAGE|METADATA|MINUTE|MINUTES|MODE|MONTH|MONTHS|MUTUALLY|MYSQL|NAME|NAMES|NATURAL|NEGATIVE|NEW|NEXT|NO|NOCREATECLUSTER|NOCREATEDB|NOCREATEROLE|NODE|NOINHERIT|NOLOGIN|NON|NONE|NOSUPERUSER|NOT|NOTICE|NOTICES|NULL|NULLIF|NULLS|OBJECTS|OF|OFFSET|ON|ONLY|OPERATOR|OPTIMIZED|OPTIMIZER|OPTIONS|OR|ORDER|ORDINALITY|OUTER|OVER|OWNED|OWNER|PARTITION|PARTITIONS|PASSWORD|PATH|PHYSICAL|PLAN|PLANS|PORT|POSITION|POSTGRES|PRECEDING|PRECISION|PREFIX|PREPARE|PRIMARY|PRIVATELINK|PRIVILEGES|PROGRESS|PROTOBUF|PROTOCOL|PUBLIC|PUBLICATION|PUSHDOWN|QUERY|QUOTE|RAISE|RANGE|RATE|RAW|READ|READY|REAL|REASSIGN|RECURSION|RECURSIVE|REDACTED|REDUCE|REFERENCE|REFERENCES|REFRESH|REGEX|REGION|REGISTRY|RENAME|REOPTIMIZE|REPEATABLE|REPLACE|REPLAN|REPLICA|REPLICAS|REPLICATION|RESET|RESPECT|RESTRICT|RETAIN|RETURN|RETURNING|REVOKE|RIGHT|ROLE|ROLES|ROLLBACK|ROTATE|ROUNDS|ROW|ROWS|SASL|SCALE|SCHEDULE|SCHEMA|SCHEMAS|SECOND|SECONDS|SECRET|SECRETS|SECURITY|SEED|SELECT|SEQUENCES|SERIALIZABLE|SERVICE|SESSION|SET|SHARD|SHOW|SINK|SINKS|SIZE|SMALLINT|SNAPSHOT|SOME|SOURCE|SOURCES|SSH|SSL|START|STDIN|STDOUT|STORAGE|STORAGECTL|STRATEGY|STRICT|STRING|STRONG|SUBSCRIBE|SUBSOURCE|SUBSOURCES|SUBSTRING|SUBTREE|SUPERUSER|SWAP|SYNTAX|SYSTEM|TABLE|TABLES|TAIL|TASK|TEMP|TEMPORARY|TEXT|THEN|TICK|TIES|TIME|TIMELINE|TIMEOUT|TIMESTAMP|TIMESTAMPTZ|TIMING|TO|TOKEN|TOPIC|TPCH|TRACE|TRAILING|TRANSACTION|TRANSACTIONAL|TRIM|TRUE|TUNNEL|TYPE|TYPES|UNBOUNDED|UNCOMMITTED|UNION|UNIQUE|UNKNOWN|UNNEST|UNTIL|UP|UPDATE|UPSERT|URL|USAGE|USER|USERNAME|USERS|USING|VALIDATE|VALUE|VALUES|VARCHAR|VARIADIC|VARYING|VERSION|VIEW|VIEWS|WAIT|WARNING|WEBHOOK|WHEN|WHERE|WINDOW|WIRE|WITH|WITHIN|WITHOUT|WORK|WORKERS|WORKLOAD|WRITE|YEAR|YEARS|YUGABYTE|ZONE|ZONES)\b">
<token type="Keyword" /> <token type="Keyword" />
</rule> </rule>
<rule pattern="[+*/&lt;&gt;=~!@#%^&amp;|`?-]+"> <rule pattern="[+*/&lt;&gt;=~!@#%^&amp;|`?-]+">


@@ -1,182 +1,137 @@
 <lexer>
 <config>
-<name>mcfunction</name>
+<name>MCFunction</name>
 <alias>mcfunction</alias>
+<alias>mcf</alias>
 <filename>*.mcfunction</filename>
-<dot_all>true</dot_all>
+<mime_type>text/mcfunction</mime_type>
+<not_multiline>true</not_multiline>
 </config>
 <rules>
-<state name="nbtobjectvalue">
-<rule pattern="(&#34;(\\\\|\\&#34;|[^&#34;])*&#34;|[a-zA-Z0-9_]+)">
-<token type="NameTag"/>
-<push state="nbtobjectattribute"/>
-</rule>
-<rule pattern="\}">
-<token type="Punctuation"/>
-<pop depth="1"/>
-</rule>
-</state>
-<state name="nbtarrayvalue">
-<rule>
-<include state="nbtvalue"/>
-</rule>
-<rule pattern=",">
-<token type="Punctuation"/>
-</rule>
-<rule pattern="\]">
-<token type="Punctuation"/>
-<pop depth="1"/>
-</rule>
-</state>
-<state name="nbtvalue">
-<rule>
-<include state="simplevalue"/>
-</rule>
-<rule pattern="\{">
-<token type="Punctuation"/>
-<push state="nbtobjectvalue"/>
-</rule>
-<rule pattern="\[">
-<token type="Punctuation"/>
-<push state="nbtarrayvalue"/>
-</rule>
-</state>
-<state name="argumentvalue">
-<rule>
-<include state="simplevalue"/>
-</rule>
-<rule pattern=",">
-<token type="Punctuation"/>
-<pop depth="1"/>
-</rule>
-<rule pattern="[}\]]">
-<token type="Punctuation"/>
-<pop depth="2"/>
-</rule>
-</state>
-<state name="argumentlist">
-<rule pattern="(nbt)(={)">
-<bygroups>
-<token type="NameAttribute"/>
-<token type="Punctuation"/>
-</bygroups>
-<push state="nbtobjectvalue"/>
-</rule>
-<rule pattern="([A-Za-z0-9/_!]+)(={)">
-<bygroups>
-<token type="NameAttribute"/>
-<token type="Punctuation"/>
-</bygroups>
-<push state="argumentlist"/>
-</rule>
-<rule pattern="([A-Za-z0-9/_!]+)(=)">
-<bygroups>
-<token type="NameAttribute"/>
-<token type="Punctuation"/>
-</bygroups>
-<push state="argumentvalue"/>
-</rule>
-<rule>
-<include state="simplevalue"/>
-</rule>
-<rule pattern=",">
-<token type="Punctuation"/>
-</rule>
-<rule pattern="[}\]]">
-<token type="Punctuation"/>
-<pop depth="1"/>
-</rule>
-</state>
-<state name="root">
-<rule pattern="#.*?\n">
-<token type="CommentSingle"/>
-</rule>
-<rule pattern="/?(geteduclientinfo|clearspawnpoint|defaultgamemode|transferserver|toggledownfall|immutableworld|detectredstone|setidletimeout|playanimation|classroommode|spreadplayers|testforblocks|setmaxplayers|setworldspawn|testforblock|worldbuilder|createagent|worldborder|camerashake|advancement|raytracefog|locatebiome|tickingarea|replaceitem|attributes|spawnpoint|difficulty|experience|scoreboard|whitelist|structure|playsound|stopsound|forceload|spectate|gamerule|function|schedule|wsserver|teleport|position|save-off|particle|setblock|datapack|mobevent|transfer|gamemode|save-all|bossbar|enchant|trigger|collect|execute|weather|teammsg|tpagent|banlist|dropall|publish|tellraw|testfor|save-on|destroy|ability|locate|summon|remove|effect|reload|ban-ip|recipe|pardon|detect|music|clear|clone|event|mixer|debug|title|ride|stop|list|turn|data|team|kick|loot|tell|help|give|flog|fill|move|time|seed|kill|save|item|deop|code|tag|ban|msg|say|tp|me|op|xp|w|place)\b">
-<token type="KeywordReserved"/>
-</rule>
-<rule pattern="(@p|@r|@a|@e|@s|@c|@v)">
-<token type="KeywordConstant"/>
-</rule>
-<rule pattern="\[">
-<token type="Punctuation"/>
-<push state="argumentlist"/>
-</rule>
-<rule pattern="{">
-<token type="Punctuation"/>
-<push state="nbtobjectvalue"/>
-</rule>
-<rule pattern="~">
-<token type="NameBuiltin"/>
-</rule>
-<rule pattern="([a-zA-Z_]+:)?[a-zA-Z_]+\b">
-<token type="Text"/>
-</rule>
-<rule pattern="([a-z]+)(\.)([0-9]+)\b">
-<bygroups>
-<token type="Text"/>
-<token type="Punctuation"/>
-<token type="LiteralNumber"/>
-</bygroups>
-</rule>
-<rule pattern="([&lt;&gt;=]|&lt;=|&gt;=)">
-<token type="Punctuation"/>
-</rule>
-<rule>
-<include state="simplevalue"/>
-</rule>
-<rule pattern="\s+">
-<token type="TextWhitespace"/>
-</rule>
-</state>
-<state name="simplevalue">
-<rule pattern="(true|false)">
-<token type="KeywordConstant"/>
-</rule>
-<rule pattern="[01]b">
-<token type="LiteralNumber"/>
-</rule>
-<rule pattern="-?(0|[1-9]\d*)(\.\d+[eE](\+|-)?\d+|[eE](\+|-)?\d+|\.\d+)">
-<token type="LiteralNumberFloat"/>
-</rule>
-<rule pattern="(-?\d+)(\.\.)(-?\d+)">
-<bygroups>
-<token type="LiteralNumberInteger"/>
-<token type="Punctuation"/>
-<token type="LiteralNumberInteger"/>
-</bygroups>
-</rule>
-<rule pattern="-?(0|[1-9]\d*)">
-<token type="LiteralNumberInteger"/>
-</rule>
-<rule pattern="&#34;(\\\\|\\&#34;|[^&#34;])*&#34;">
-<token type="LiteralStringDouble"/>
-</rule>
-<rule pattern="&#39;[^&#39;]+&#39;">
-<token type="LiteralStringSingle"/>
-</rule>
-<rule pattern="([!#]?)(\w+)">
-<bygroups>
-<token type="Punctuation"/>
-<token type="Text"/>
-</bygroups>
-</rule>
-</state>
-<state name="nbtobjectattribute">
-<rule>
-<include state="nbtvalue"/>
-</rule>
-<rule pattern=":">
-<token type="Punctuation"/>
-</rule>
-<rule pattern=",">
-<token type="Punctuation"/>
-<pop depth="1"/>
-</rule>
-<rule pattern="\}">
-<token type="Punctuation"/>
-<pop depth="2"/>
-</rule>
-</state>
+<state name="root">
+<rule><include state="names"/></rule>
+<rule><include state="comments"/></rule>
+<rule><include state="literals"/></rule>
+<rule><include state="whitespace"/></rule>
+<rule><include state="property"/></rule>
+<rule><include state="operators"/></rule>
+<rule><include state="selectors"/></rule>
+</state>
+<state name="names">
+<rule pattern="^(\s*)([a-z_]+)"><bygroups><token type="TextWhitespace"/><token type="NameBuiltin"/></bygroups></rule>
+<rule pattern="(?&lt;=run)\s+[a-z_]+"><token type="NameBuiltin"/></rule>
+<rule pattern="\b[0-9a-fA-F]+(?:-[0-9a-fA-F]+){4}\b"><token type="NameVariable"/></rule>
+<rule><include state="resource-name"/></rule>
+<rule pattern="[A-Za-z_][\w.#%$]+"><token type="KeywordConstant"/></rule>
+<rule pattern="[#%$][\w.#%$]+"><token type="NameVariableMagic"/></rule>
+</state>
+<state name="resource-name">
+<rule pattern="#?[a-z_][a-z_.-]*:[a-z0-9_./-]+"><token type="NameFunction"/></rule>
+<rule pattern="#?[a-z0-9_\.\-]+\/[a-z0-9_\.\-\/]+"><token type="NameFunction"/></rule>
+</state>
+<state name="whitespace">
+<rule pattern="\s+"><token type="TextWhitespace"/></rule>
+</state>
+<state name="comments">
+<rule pattern="^\s*(#[&gt;!])"><token type="CommentMultiline"/><push state="comments.block" state="comments.block.emphasized"/></rule>
+<rule pattern="#.*$"><token type="CommentSingle"/></rule>
+</state>
+<state name="comments.block">
+<rule pattern="^\s*#[&gt;!]"><token type="CommentMultiline"/><push state="comments.block.emphasized"/></rule>
+<rule pattern="^\s*#"><token type="CommentMultiline"/><push state="comments.block.normal"/></rule>
+<rule><pop depth="1"/></rule>
+</state>
+<state name="comments.block.normal">
+<rule><include state="comments.block.special"/></rule>
+<rule pattern="\S+"><token type="CommentMultiline"/></rule>
+<rule pattern="\n"><token type="Text"/><pop depth="1"/></rule>
+<rule><include state="whitespace"/></rule>
+</state>
+<state name="comments.block.emphasized">
+<rule><include state="comments.block.special"/></rule>
+<rule pattern="\S+"><token type="LiteralStringDoc"/></rule>
+<rule pattern="\n"><token type="Text"/><pop depth="1"/></rule>
+<rule><include state="whitespace"/></rule>
+</state>
+<state name="comments.block.special">
+<rule pattern="@\S+"><token type="NameDecorator"/></rule>
+<rule><include state="resource-name"/></rule>
+<rule pattern="[#%$][\w.#%$]+"><token type="NameVariableMagic"/></rule>
+</state>
+<state name="operators">
+<rule pattern="[\-~%^?!+*&lt;&gt;\\/|&amp;=.]"><token type="Operator"/></rule>
+</state>
+<state name="literals">
+<rule pattern="\.\."><token type="Literal"/></rule>
+<rule pattern="(true|false)"><token type="KeywordPseudo"/></rule>
+<rule pattern="[A-Za-z_]+"><token type="NameVariableClass"/></rule>
+<rule pattern="[0-7]b"><token type="LiteralNumberByte"/></rule>
+<rule pattern="[+-]?\d*\.?\d+([eE]?[+-]?\d+)?[df]?\b"><token type="LiteralNumberFloat"/></rule>
+<rule pattern="[+-]?\d+\b"><token type="LiteralNumberInteger"/></rule>
+<rule pattern="&quot;"><token type="LiteralStringDouble"/><push state="literals.string-double"/></rule>
+<rule pattern="&#x27;"><token type="LiteralStringSingle"/><push state="literals.string-single"/></rule>
+</state>
+<state name="literals.string-double">
+<rule pattern="\\."><token type="LiteralStringEscape"/></rule>
+<rule pattern="[^\\&quot;\n]+"><token type="LiteralStringDouble"/></rule>
+<rule pattern="&quot;"><token type="LiteralStringDouble"/><pop depth="1"/></rule>
+</state>
+<state name="literals.string-single">
+<rule pattern="\\."><token type="LiteralStringEscape"/></rule>
+<rule pattern="[^\\&#x27;\n]+"><token type="LiteralStringSingle"/></rule>
+<rule pattern="&#x27;"><token type="LiteralStringSingle"/><pop depth="1"/></rule>
+</state>
+<state name="selectors">
+<rule pattern="@[a-z]"><token type="NameVariable"/></rule>
+</state>
+<state name="property">
+<rule pattern="\{"><token type="Punctuation"/><push state="property.curly" state="property.key"/></rule>
+<rule pattern="\["><token type="Punctuation"/><push state="property.square" state="property.key"/></rule>
+</state>
+<state name="property.curly">
+<rule><include state="whitespace"/></rule>
+<rule><include state="property"/></rule>
+<rule pattern="\}"><token type="Punctuation"/><pop depth="1"/></rule>
+</state>
+<state name="property.square">
+<rule><include state="whitespace"/></rule>
+<rule><include state="property"/></rule>
+<rule pattern="\]"><token type="Punctuation"/><pop depth="1"/></rule>
+<rule pattern=","><token type="Punctuation"/></rule>
+</state>
+<state name="property.key">
+<rule><include state="whitespace"/></rule>
+<rule pattern="#?[a-z_][a-z_\.\-]*\:[a-z0-9_\.\-/]+(?=\s*\=)"><token type="NameAttribute"/><push state="property.delimiter"/></rule>
+<rule pattern="#?[a-z_][a-z0-9_\.\-/]+"><token type="NameAttribute"/><push state="property.delimiter"/></rule>
+<rule pattern="[A-Za-z_\-\+]+"><token type="NameAttribute"/><push state="property.delimiter"/></rule>
+<rule pattern="&quot;"><token type="NameAttribute"/><push state="property.delimiter"/></rule>
+<rule pattern="&#x27;"><token type="NameAttribute"/><push state="property.delimiter"/></rule>
+<rule pattern="-?\d+"><token type="LiteralNumberInteger"/><push state="property.delimiter"/></rule>
+<rule><pop depth="1"/></rule>
+</state>
+<state name="property.key.string-double">
+<rule pattern="\\."><token type="LiteralStringEscape"/></rule>
+<rule pattern="[^\\&quot;\n]+"><token type="NameAttribute"/></rule>
+<rule pattern="&quot;"><token type="NameAttribute"/><pop depth="1"/></rule>
+</state>
+<state name="property.key.string-single">
+<rule pattern="\\."><token type="LiteralStringEscape"/></rule>
+<rule pattern="[^\\&#x27;\n]+"><token type="NameAttribute"/></rule>
+<rule pattern="&#x27;"><token type="NameAttribute"/><pop depth="1"/></rule>
+</state>
+<state name="property.delimiter">
+<rule><include state="whitespace"/></rule>
+<rule pattern="[:=]!?"><token type="Punctuation"/><push state="property.value"/></rule>
+<rule pattern=","><token type="Punctuation"/></rule>
+<rule><pop depth="1"/></rule>
+</state>
+<state name="property.value">
+<rule><include state="whitespace"/></rule>
+<rule pattern="#?[a-z_][a-z_\.\-]*\:[a-z0-9_\.\-/]+"><token type="NameTag"/></rule>
+<rule pattern="#?[a-z_][a-z0-9_\.\-/]+"><token type="NameTag"/></rule>
+<rule><include state="literals"/></rule>
+<rule><include state="property"/></rule>
+<rule><pop depth="1"/></rule>
+</state>
 </rules>
 </lexer>


@@ -106,7 +106,7 @@
 </bygroups>
 <push state="interpol"/>
 </rule>
-<rule pattern="(&amp;&amp;|&gt;=|&lt;=|\+\+|-&gt;|!=|\|\||//|==|@|!|\+|\?|&lt;|\.|&gt;|\*)">
+<rule pattern="(&amp;&amp;|&gt;=|&lt;=|\+\+|-&gt;|!=|=|\|\||//|==|@|!|\+|\?|&lt;|\.|&gt;|\*)">
 <token type="Operator"/>
 </rule>
 <rule pattern="[;:]">

lexers/nsis.xml (new file)

@@ -0,0 +1,59 @@
<lexer>
<config>
<name>NSIS</name>
<alias>nsis</alias>
<alias>nsi</alias>
<alias>nsh</alias>
<filename>*.nsi</filename>
<filename>*.nsh</filename>
<mime_type>text/x-nsis</mime_type>
<case_insensitive>true</case_insensitive>
<not_multiline>true</not_multiline>
</config>
<rules>
<state name="root">
<rule pattern="([;#].*)(\n)"><bygroups><token type="Comment"/><token type="TextWhitespace"/></bygroups></rule>
<rule pattern="&#x27;.*?&#x27;"><token type="LiteralStringSingle"/></rule>
<rule pattern="&quot;"><token type="LiteralStringDouble"/><push state="str_double"/></rule>
<rule pattern="`"><token type="LiteralStringBacktick"/><push state="str_backtick"/></rule>
<rule><include state="macro"/></rule>
<rule><include state="interpol"/></rule>
<rule><include state="basic"/></rule>
<rule pattern="\$\{[a-z_|][\w|]*\}"><token type="KeywordPseudo"/></rule>
<rule pattern="/[a-z_]\w*"><token type="NameAttribute"/></rule>
<rule pattern="\s+"><token type="TextWhitespace"/></rule>
<rule pattern="[\w.]+"><token type="Text"/></rule>
</state>
<state name="basic">
<rule pattern="(\n)(Function)(\s+)([._a-z][.\w]*)\b"><bygroups><token type="TextWhitespace"/><token type="Keyword"/><token type="TextWhitespace"/><token type="NameFunction"/></bygroups></rule>
<rule pattern="\b([_a-z]\w*)(::)([a-z][a-z0-9]*)\b"><bygroups><token type="KeywordNamespace"/><token type="Punctuation"/><token type="NameFunction"/></bygroups></rule>
<rule pattern="\b([_a-z]\w*)(:)"><bygroups><token type="NameLabel"/><token type="Punctuation"/></bygroups></rule>
<rule pattern="(\b[ULS]|\B)([!&lt;&gt;=]?=|\&lt;\&gt;?|\&gt;)\B"><token type="Operator"/></rule>
<rule pattern="[|+-]"><token type="Operator"/></rule>
<rule pattern="\\"><token type="Punctuation"/></rule>
<rule pattern="\b(Abort|Add(?:BrandingImage|Size)|Allow(?:RootDirInstall|SkipFiles)|AutoCloseWindow|BG(?:Font|Gradient)|BrandingText|BringToFront|Call(?:InstDLL)?|(?:Sub)?Caption|ChangeUI|CheckBitmap|ClearErrors|CompletedText|ComponentText|CopyFiles|CRCCheck|Create(?:Directory|Font|Shortcut)|Delete(?:INI(?:Sec|Str)|Reg(?:Key|Value))?|DetailPrint|DetailsButtonText|Dir(?:Show|Text|Var|Verify)|(?:Disabled|Enabled)Bitmap|EnableWindow|EnumReg(?:Key|Value)|Exch|Exec(?:Shell|Wait)?|ExpandEnvStrings|File(?:BufSize|Close|ErrorText|Open|Read(?:Byte)?|Seek|Write(?:Byte)?)?|Find(?:Close|First|Next|Window)|FlushINI|Function(?:End)?|Get(?:CurInstType|CurrentAddress|DlgItem|DLLVersion(?:Local)?|ErrorLevel|FileTime(?:Local)?|FullPathName|FunctionAddress|InstDirError|LabelAddress|TempFileName)|Goto|HideWindow|Icon|If(?:Abort|Errors|FileExists|RebootFlag|Silent)|InitPluginsDir|Install(?:ButtonText|Colors|Dir(?:RegKey)?)|Inst(?:ProgressFlags|Type(?:[GS]etText)?)|Int(?:CmpU?|Fmt|Op)|IsWindow|LangString(?:UP)?|License(?:BkColor|Data|ForceSelection|LangString|Text)|LoadLanguageFile|LockWindow|Log(?:Set|Text)|MessageBox|MiscButtonText|Name|Nop|OutFile|(?:Uninst)?Page(?:Ex(?:End)?)?|PluginDir|Pop|Push|Quit|Read(?:(?:Env|INI|Reg)Str|RegDWORD)|Reboot|(?:Un)?RegDLL|Rename|RequestExecutionLevel|ReserveFile|Return|RMDir|SearchPath|Section(?:Divider|End|(?:(?:Get|Set)(?:Flags|InstTypes|Size|Text))|Group(?:End)?|In)?|SendMessage|Set(?:AutoClose|BrandingImage|Compress(?:ionLevel|or(?:DictSize)?)?|CtlColors|CurInstType|DatablockOptimize|DateSave|Details(?:Print|View)|Error(?:s|Level)|FileAttributes|Font|OutPath|Overwrite|PluginUnload|RebootFlag|ShellVarContext|Silent|StaticBkColor)|Show(?:(?:I|Uni)nstDetails|Window)|Silent(?:Un)?Install|Sleep|SpaceTexts|Str(?:CmpS?|Cpy|Len)|SubSection(?:End)?|Uninstall(?:ButtonText|(?:Sub)?Caption|EXEName|Icon|Text)|UninstPage|Var|VI(?:AddVersionKey|ProductVersion)|WindowIcon|Write(?:INIStr|Reg(:?Bin|DWORD|(?:Expand)?Str)|Uninstaller)|XPStyle)\b"><token type="Keyword"/></rule>
<rule pattern="\b(CUR|END|(?:FILE_ATTRIBUTE_)?(?:ARCHIVE|HIDDEN|NORMAL|OFFLINE|READONLY|SYSTEM|TEMPORARY)|HK(CC|CR|CU|DD|LM|PD|U)|HKEY_(?:CLASSES_ROOT|CURRENT_(?:CONFIG|USER)|DYN_DATA|LOCAL_MACHINE|PERFORMANCE_DATA|USERS)|ID(?:ABORT|CANCEL|IGNORE|NO|OK|RETRY|YES)|MB_(?:ABORTRETRYIGNORE|DEFBUTTON[1-4]|ICON(?:EXCLAMATION|INFORMATION|QUESTION|STOP)|OK(?:CANCEL)?|RETRYCANCEL|RIGHT|SETFOREGROUND|TOPMOST|USERICON|YESNO(?:CANCEL)?)|SET|SHCTX|SW_(?:HIDE|SHOW(?:MAXIMIZED|MINIMIZED|NORMAL))|admin|all|auto|both|bottom|bzip2|checkbox|colored|current|false|force|hide|highest|if(?:diff|newer)|lastused|leave|left|listonly|lzma|nevershow|none|normal|off|on|pop|push|radiobuttons|right|show|silent|silentlog|smooth|textonly|top|true|try|user|zlib)\b"><token type="NameConstant"/></rule>
</state>
<state name="macro">
<rule pattern="\!(addincludedir(?:dir)?|addplugindir|appendfile|cd|define|delfilefile|echo(?:message)?|else|endif|error|execute|if(?:macro)?n?(?:def)?|include|insertmacro|macro(?:end)?|packhdr|search(?:parse|replace)|system|tempfilesymbol|undef|verbose|warning)\b"><token type="CommentPreproc"/></rule>
</state>
<state name="interpol">
<rule pattern="\$(R?[0-9])"><token type="NameBuiltinPseudo"/></rule>
<rule pattern="\$(ADMINTOOLS|APPDATA|CDBURN_AREA|COOKIES|COMMONFILES(?:32|64)|DESKTOP|DOCUMENTS|EXE(?:DIR|FILE|PATH)|FAVORITES|FONTS|HISTORY|HWNDPARENT|INTERNET_CACHE|LOCALAPPDATA|MUSIC|NETHOOD|PICTURES|PLUGINSDIR|PRINTHOOD|PROFILE|PROGRAMFILES(?:32|64)|QUICKLAUNCH|RECENT|RESOURCES(?:_LOCALIZED)?|SENDTO|SM(?:PROGRAMS|STARTUP)|STARTMENU|SYSDIR|TEMP(?:LATES)?|VIDEOS|WINDIR|\{NSISDIR\})"><token type="NameBuiltin"/></rule>
<rule pattern="\$(CMDLINE|INSTDIR|OUTDIR|LANGUAGE)"><token type="NameVariableGlobal"/></rule>
<rule pattern="\$[a-z_]\w*"><token type="NameVariable"/></rule>
</state>
<state name="str_double">
<rule pattern="&quot;"><token type="LiteralStringDouble"/><pop depth="1"/></rule>
<rule pattern="\$(\\[nrt&quot;]|\$)"><token type="LiteralStringEscape"/></rule>
<rule><include state="interpol"/></rule>
<rule pattern="[^&quot;]+"><token type="LiteralStringDouble"/></rule>
</state>
<state name="str_backtick">
<rule pattern="`"><token type="LiteralStringDouble"/><pop depth="1"/></rule>
<rule pattern="\$(\\[nrt&quot;]|\$)"><token type="LiteralStringEscape"/></rule>
<rule><include state="interpol"/></rule>
<rule pattern="[^`]+"><token type="LiteralStringDouble"/></rule>
</state>
</rules>
</lexer>


@@ -41,6 +41,14 @@
 <rule pattern="\b(as|assert|begin|class|constraint|do|done|downto|else|end|exception|external|false|for|fun|function|functor|if|in|include|inherit|initializer|lazy|let|match|method|module|mutable|new|object|of|open|private|raise|rec|sig|struct|then|to|true|try|type|value|val|virtual|when|while|with)\b">
 <token type="Keyword"/>
 </rule>
+<rule pattern="({([a-z_]*)\|)([\s\S]+?)(?=\|\2})(\|\2})">
+<bygroups>
+<token type="LiteralStringAffix"/>
+<token type="Ignore"/>
+<token type="LiteralString"/>
+<token type="LiteralStringAffix"/>
+</bygroups>
+</rule>
 <rule pattern="(~|\}|\|]|\||\{&lt;|\{|`|_|]|\[\||\[&gt;|\[&lt;|\[|\?\?|\?|&gt;\}|&gt;]|&gt;|=|&lt;-|&lt;|;;|;|:&gt;|:=|::|:|\.\.|\.|-&gt;|-\.|-|,|\+|\*|\)|\(|&amp;&amp;|&amp;|#|!=)">
 <token type="Operator"/>
 </rule>


@@ -51,6 +51,20 @@
 <rule pattern = "\#[a-zA-Z_]+\b">
 <token type = "NameDecorator"/>
 </rule>
+<rule pattern = "^\#\+\w+\s*$">
+<token type = "NameAttribute"/>
+</rule>
+<rule pattern = "^(\#\+\w+)(\s+)(\!)?([A-Za-z0-9-_!]+)(?:(,)(\!)?([A-Za-z0-9-_!]+))*\s*$">
+<bygroups>
+<token type = "NameAttribute"/>
+<token type = "TextWhitespace"/>
+<token type = "Operator"/>
+<token type = "Name"/>
+<token type = "Punctuation"/>
+<token type = "Operator"/>
+<token type = "Name"/>
+</bygroups>
+</rule>
 <rule pattern = "\@(\([a-zA-Z_]+\b\s*.*\)|\(?[a-zA-Z_]+\)?)">
 <token type = "NameAttribute"/>
 </rule>

lexers/snbt.xml (new file)

@@ -0,0 +1,57 @@
<lexer>
<config>
<name>SNBT</name>
<alias>snbt</alias>
<filename>*.snbt</filename>
<mime_type>text/snbt</mime_type>
</config>
<rules>
<state name="root">
<rule pattern="\{"><token type="Punctuation"/><push state="compound"/></rule>
<rule pattern="[^\{]+"><token type="Text"/></rule>
</state>
<state name="whitespace">
<rule pattern="\s+"><token type="TextWhitespace"/></rule>
</state>
<state name="operators">
<rule pattern="[,:;]"><token type="Punctuation"/></rule>
</state>
<state name="literals">
<rule pattern="(true|false)"><token type="KeywordConstant"/></rule>
<rule pattern="-?\d+[eE]-?\d+"><token type="LiteralNumberFloat"/></rule>
<rule pattern="-?\d*\.\d+[fFdD]?"><token type="LiteralNumberFloat"/></rule>
<rule pattern="-?\d+[bBsSlLfFdD]?"><token type="LiteralNumberInteger"/></rule>
<rule pattern="&quot;"><token type="LiteralStringDouble"/><push state="literals.string_double"/></rule>
<rule pattern="&#x27;"><token type="LiteralStringSingle"/><push state="literals.string_single"/></rule>
</state>
<state name="literals.string_double">
<rule pattern="\\."><token type="LiteralStringEscape"/></rule>
<rule pattern="[^\\&quot;\n]+"><token type="LiteralStringDouble"/></rule>
<rule pattern="&quot;"><token type="LiteralStringDouble"/><pop depth="1"/></rule>
</state>
<state name="literals.string_single">
<rule pattern="\\."><token type="LiteralStringEscape"/></rule>
<rule pattern="[^\\&#x27;\n]+"><token type="LiteralStringSingle"/></rule>
<rule pattern="&#x27;"><token type="LiteralStringSingle"/><pop depth="1"/></rule>
</state>
<state name="compound">
<rule pattern="[A-Z_a-z]+"><token type="NameAttribute"/></rule>
<rule><include state="operators"/></rule>
<rule><include state="whitespace"/></rule>
<rule><include state="literals"/></rule>
<rule pattern="\{"><token type="Punctuation"/><push/></rule>
<rule pattern="\["><token type="Punctuation"/><push state="list"/></rule>
<rule pattern="\}"><token type="Punctuation"/><pop depth="1"/></rule>
</state>
<state name="list">
<rule pattern="[A-Z_a-z]+"><token type="NameAttribute"/></rule>
<rule><include state="literals"/></rule>
<rule><include state="operators"/></rule>
<rule><include state="whitespace"/></rule>
<rule pattern="\["><token type="Punctuation"/><push/></rule>
<rule pattern="\{"><token type="Punctuation"/><push state="compound"/></rule>
<rule pattern="\]"><token type="Punctuation"/><pop depth="1"/></rule>
</state>
</rules>
</lexer>
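As a quick orientation for what this new lexer targets: SNBT is the stringified NBT syntax used in Minecraft commands, i.e. compounds, lists, quoted strings and suffixed numeric literals. Below is a minimal, hypothetical sketch of pushing such a snippet through the high-level API, assuming the shard is loaded with require "tartrazine" and that the snbt alias declared in the config block above is what the name lookup resolves.

require "tartrazine"

# Highlight a small SNBT compound for the terminal.
# "snbt" is the alias from the <config> block of the new lexer.
snbt = %({id:"minecraft:stone",Count:1b,Enchantments:[{id:"sharpness",lvl:3s}]})
puts Tartrazine.to_ansi(snbt, "snbt")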


@ -157,8 +157,20 @@
<rule pattern="(continue|returns|storage|memory|delete|return|throw|break|catch|while|else|from|new|try|for|if|is|as|do|in|_)\b"> <rule pattern="(continue|returns|storage|memory|delete|return|throw|break|catch|while|else|from|new|try|for|if|is|as|do|in|_)\b">
<token type="Keyword"/> <token type="Keyword"/>
</rule> </rule>
<rule pattern="assembly\b"> <rule pattern="(assembly)(\s+\()(.+)(\)\s+{)">
<token type="Keyword"/> <bygroups>
<token type="Keyword"/>
<token type="Text"/>
<token type="LiteralString"/>
<token type="Text"/>
</bygroups>
<push state="assembly"/>
</rule>
<rule pattern="(assembly)(\s+{)">
<bygroups>
<token type="Keyword"/>
<token type="Text"/>
</bygroups>
<push state="assembly"/> <push state="assembly"/>
</rule> </rule>
<rule pattern="(contract|interface|enum|event|struct)(\s+)([a-zA-Z_]\w*)"> <rule pattern="(contract|interface|enum|event|struct)(\s+)([a-zA-Z_]\w*)">
@ -235,7 +247,7 @@
<token type="Punctuation"/> <token type="Punctuation"/>
<pop depth="1"/> <pop depth="1"/>
</rule> </rule>
<rule pattern="[(),]"> <rule pattern="[(),.]">
<token type="Punctuation"/> <token type="Punctuation"/>
</rule> </rule>
<rule pattern=":=|=:"> <rule pattern=":=|=:">


@ -51,6 +51,22 @@
</rule> </rule>
</state> </state>
<state name="tag"> <state name="tag">
<rule>
<include state="jsx"/>
</rule>
<rule pattern=",">
<token type="Punctuation"/>
</rule>
<rule pattern="&#34;(\\\\|\\&#34;|[^&#34;])*&#34;">
<token type="LiteralStringDouble"/>
</rule>
<rule pattern="&#39;(\\\\|\\&#39;|[^&#39;])*&#39;">
<token type="LiteralStringSingle"/>
</rule>
<rule pattern="`">
<token type="LiteralStringBacktick"/>
<push state="interp"/>
</rule>
<rule> <rule>
<include state="commentsandwhitespace"/> <include state="commentsandwhitespace"/>
</rule> </rule>
@ -171,7 +187,7 @@
</rule> </rule>
<rule pattern="(?=/)"> <rule pattern="(?=/)">
<token type="Text"/> <token type="Text"/>
<push state="#pop" state="badregex"/> <push state="badregex"/>
</rule> </rule>
<rule> <rule>
<pop depth="1"/> <pop depth="1"/>

lexers/typst.xml Normal file

@ -0,0 +1,107 @@
<lexer>
<config>
<name>Typst</name>
<alias>typst</alias>
<filename>*.typ</filename>
<mime_type>text/x-typst</mime_type>
</config>
<rules>
<state name="root">
<rule><include state="markup"/></rule>
</state>
<state name="into_code">
<rule pattern="(\#let|\#set|\#show)\b"><token type="KeywordDeclaration"/><push state="inline_code"/></rule>
<rule pattern="(\#import|\#include)\b"><token type="KeywordNamespace"/><push state="inline_code"/></rule>
<rule pattern="(\#if|\#for|\#while|\#export)\b"><token type="KeywordReserved"/><push state="inline_code"/></rule>
<rule pattern="#\{"><token type="Punctuation"/><push state="code"/></rule>
<rule pattern="#\("><token type="Punctuation"/><push state="code"/></rule>
<rule pattern="(#[a-zA-Z_][a-zA-Z0-9_-]*)(\[)"><bygroups><token type="NameFunction"/><token type="Punctuation"/></bygroups><push state="markup"/></rule>
<rule pattern="(#[a-zA-Z_][a-zA-Z0-9_-]*)(\()"><bygroups><token type="NameFunction"/><token type="Punctuation"/></bygroups><push state="code"/></rule>
<rule pattern="(\#true|\#false|\#none|\#auto)\b"><token type="KeywordConstant"/></rule>
<rule pattern="#[a-zA-Z_][a-zA-Z0-9_]*"><token type="NameVariable"/></rule>
<rule pattern="#0x[0-9a-fA-F]+"><token type="LiteralNumberHex"/></rule>
<rule pattern="#0b[01]+"><token type="LiteralNumberBin"/></rule>
<rule pattern="#0o[0-7]+"><token type="LiteralNumberOct"/></rule>
<rule pattern="#[0-9]+[\.e][0-9]+"><token type="LiteralNumberFloat"/></rule>
<rule pattern="#[0-9]+"><token type="LiteralNumberInteger"/></rule>
</state>
<state name="markup">
<rule><include state="comment"/></rule>
<rule pattern="^\s*=+.*$"><token type="GenericHeading"/></rule>
<rule pattern="[*][^*]*[*]"><token type="GenericStrong"/></rule>
<rule pattern="_[^_]*_"><token type="GenericEmph"/></rule>
<rule pattern="\$"><token type="Punctuation"/><push state="math"/></rule>
<rule pattern="`[^`]*`"><token type="LiteralStringBacktick"/></rule>
<rule pattern="^(\s*)(-)(\s+)"><bygroups><token type="TextWhitespace"/><token type="Punctuation"/><token type="TextWhitespace"/></bygroups></rule>
<rule pattern="^(\s*)(\+)(\s+)"><bygroups><token type="TextWhitespace"/><token type="Punctuation"/><token type="TextWhitespace"/></bygroups></rule>
<rule pattern="^(\s*)([0-9]+\.)"><bygroups><token type="TextWhitespace"/><token type="Punctuation"/></bygroups></rule>
<rule pattern="^(\s*)(/)(\s+)([^:]+)(:)"><bygroups><token type="TextWhitespace"/><token type="Punctuation"/><token type="TextWhitespace"/><token type="NameVariable"/><token type="Punctuation"/></bygroups></rule>
<rule pattern="&lt;[a-zA-Z_][a-zA-Z0-9_-]*&gt;"><token type="NameLabel"/></rule>
<rule pattern="@[a-zA-Z_][a-zA-Z0-9_-]*"><token type="NameLabel"/></rule>
<rule pattern="\\#"><token type="Text"/></rule>
<rule><include state="into_code"/></rule>
<rule pattern="```(?:.|\n)*?```"><token type="LiteralStringBacktick"/></rule>
<rule pattern="https?://[0-9a-zA-Z~/%#&amp;=\&#x27;,;.+?]*"><token type="GenericEmph"/></rule>
<rule pattern="(\-\-\-|\\|\~|\-\-|\.\.\.)\B"><token type="Punctuation"/></rule>
<rule pattern="\\\["><token type="Punctuation"/></rule>
<rule pattern="\\\]"><token type="Punctuation"/></rule>
<rule pattern="\["><token type="Punctuation"/><push/></rule>
<rule pattern="\]"><token type="Punctuation"/><pop depth="1"/></rule>
<rule pattern="[ \t]+\n?|\n"><token type="TextWhitespace"/></rule>
<rule pattern="((?![*_$`&lt;@\\#\] ]|https?://).)+"><token type="Text"/></rule>
</state>
<state name="math">
<rule><include state="comment"/></rule>
<rule pattern="(\\_|\\\^|\\\&amp;)"><token type="Text"/></rule>
<rule pattern="(_|\^|\&amp;|;)"><token type="Punctuation"/></rule>
<rule pattern="(\+|/|=|\[\||\|\]|\|\||\*|:=|::=|\.\.\.|&#x27;|\-|=:|!=|&gt;&gt;|&gt;=|&gt;&gt;&gt;|&lt;&lt;|&lt;=|&lt;&lt;&lt;|\-&gt;|\|\-&gt;|=&gt;|\|=&gt;|==&gt;|\-\-&gt;|\~\~&gt;|\~&gt;|&gt;\-&gt;|\-&gt;&gt;|&lt;\-|&lt;==|&lt;\-\-|&lt;\~\~|&lt;\~|&lt;\-&lt;|&lt;&lt;\-|&lt;\-&gt;|&lt;=&gt;|&lt;==&gt;|&lt;\-\-&gt;|&gt;|&lt;|\~|:|\|)"><token type="Operator"/></rule>
<rule pattern="\\"><token type="Punctuation"/></rule>
<rule pattern="\\\$"><token type="Punctuation"/></rule>
<rule pattern="\$"><token type="Punctuation"/><pop depth="1"/></rule>
<rule><include state="into_code"/></rule>
<rule pattern="([a-zA-Z][a-zA-Z0-9-]*)(\s*)(\()"><bygroups><token type="NameFunction"/><token type="TextWhitespace"/><token type="Punctuation"/></bygroups></rule>
<rule pattern="([a-zA-Z][a-zA-Z0-9-]*)(:)"><bygroups><token type="NameVariable"/><token type="Punctuation"/></bygroups></rule>
<rule pattern="([a-zA-Z][a-zA-Z0-9-]*)"><token type="NameVariable"/></rule>
<rule pattern="[0-9]+(\.[0-9]+)?"><token type="LiteralNumber"/></rule>
<rule pattern="\.{1,3}|\(|\)|,|\{|\}"><token type="Punctuation"/></rule>
<rule pattern="&quot;[^&quot;]*&quot;"><token type="LiteralStringDouble"/></rule>
<rule pattern="[ \t\n]+"><token type="TextWhitespace"/></rule>
</state>
<state name="comment">
<rule pattern="//.*$"><token type="CommentSingle"/></rule>
<rule pattern="/[*](.|\n)*?[*]/"><token type="CommentMultiline"/></rule>
</state>
<state name="code">
<rule><include state="comment"/></rule>
<rule pattern="\["><token type="Punctuation"/><push state="markup"/></rule>
<rule pattern="\(|\{"><token type="Punctuation"/><push state="code"/></rule>
<rule pattern="\)|\}"><token type="Punctuation"/><pop depth="1"/></rule>
<rule pattern="&quot;[^&quot;]*&quot;"><token type="LiteralStringDouble"/></rule>
<rule pattern=",|\.{1,2}"><token type="Punctuation"/></rule>
<rule pattern="="><token type="Operator"/></rule>
<rule pattern="(and|or|not)\b"><token type="OperatorWord"/></rule>
<rule pattern="=&gt;|&lt;=|==|!=|&gt;|&lt;|-=|\+=|\*=|/=|\+|-|\\|\*"><token type="Operator"/></rule>
<rule pattern="([a-zA-Z_][a-zA-Z0-9_-]*)(:)"><bygroups><token type="NameVariable"/><token type="Punctuation"/></bygroups></rule>
<rule pattern="([a-zA-Z_][a-zA-Z0-9_-]*)(\()"><bygroups><token type="NameFunction"/><token type="Punctuation"/></bygroups><push state="code"/></rule>
<rule pattern="(as|break|export|continue|else|for|if|in|return|while)\b"><token type="KeywordReserved"/></rule>
<rule pattern="(import|include)\b"><token type="KeywordNamespace"/></rule>
<rule pattern="(auto|none|true|false)\b"><token type="KeywordConstant"/></rule>
<rule pattern="([0-9.]+)(mm|pt|cm|in|em|fr|%)"><bygroups><token type="LiteralNumber"/><token type="KeywordReserved"/></bygroups></rule>
<rule pattern="0x[0-9a-fA-F]+"><token type="LiteralNumberHex"/></rule>
<rule pattern="0b[01]+"><token type="LiteralNumberBin"/></rule>
<rule pattern="0o[0-7]+"><token type="LiteralNumberOct"/></rule>
<rule pattern="[0-9]+[\.e][0-9]+"><token type="LiteralNumberFloat"/></rule>
<rule pattern="[0-9]+"><token type="LiteralNumberInteger"/></rule>
<rule pattern="(let|set|show)\b"><token type="KeywordDeclaration"/></rule>
<rule pattern="([a-zA-Z_][a-zA-Z0-9_-]*)"><token type="NameVariable"/></rule>
<rule pattern="[ \t\n]+"><token type="TextWhitespace"/></rule>
<rule pattern=":"><token type="Punctuation"/></rule>
</state>
<state name="inline_code">
<rule pattern=";\b"><token type="Punctuation"/><pop depth="1"/></rule>
<rule pattern="\n"><token type="TextWhitespace"/><pop depth="1"/></rule>
<rule><include state="code"/></rule>
</state>
</rules>
</lexer>
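To see which of these states a piece of Typst source flows through, it can help to dump the raw token stream rather than formatted output. This is a speculative sketch using the tokenizer API that appears elsewhere in this changeset (Tartrazine.lexer(...).tokenizer(text).to_a); the sample document and the "typst" name both come from the lexer config above.

require "tartrazine"

typst = <<-TYP
  = A heading
  Some *strong* and _emphasised_ text, inline math $x^2 + 1$,
  a label <intro>, and a function call: #emph[hello].
  TYP

# Print one line per token produced by the new Typst lexer.
Tartrazine.lexer(name: "typst").tokenizer(typst).to_a.each do |tok|
  puts "#{tok[:type]} => #{tok[:value].inspect}"
end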

lexers/webvtt.xml Normal file

@ -0,0 +1,283 @@
<lexer>
<config>
<name>WebVTT</name>
<alias>vtt</alias>
<filename>*.vtt</filename>
<mime_type>text/vtt</mime_type>
</config>
<!--
The WebVTT spec refers to a WebVTT line terminator as either CRLF, CR or LF.
(https://www.w3.org/TR/webvtt1/#webvtt-line-terminator) However, with this
definition it is unclear whether CRLF is one line terminator (CRLF) or two
line terminators (CR and LF).
To work around this ambiguity, only CRLF and LF are treated as line terminators here.
To my knowledge, only classic Mac OS used a lone CR as its line terminator, so the lexer should
still work for most files.
-->
<rules>
<!-- https://www.w3.org/TR/webvtt1/#webvtt-file-body -->
<state name="root">
<rule pattern="(\AWEBVTT)((?:[ \t][^\r\n]*)?(?:\r?\n){2,})">
<bygroups>
<token type="Keyword" />
<token type="Text" />
</bygroups>
</rule>
<rule pattern="(^REGION)([ \t]*$)">
<bygroups>
<token type="Keyword" />
<token type="Text" />
</bygroups>
<push state="region-settings-list" />
</rule>
<rule
pattern="(^STYLE)([ \t]*$)((?:(?!&#45;&#45;&gt;)[\s\S])*?)((?:\r?\n){2})">
<bygroups>
<token type="Keyword" />
<token type="Text" />
<using lexer="CSS" />
<token type="Text" />
</bygroups>
</rule>
<rule>
<include state="comment" />
</rule>
<rule
pattern="(?=((?![^\r\n]*&#45;&#45;&gt;)[^\r\n]*\r?\n)?(\d{2}:)?(?:[0-5][0-9]):(?:[0-5][0-9])\.\d{3}[ \t]+&#45;&#45;&gt;[ \t]+(\d{2}:)?(?:[0-5][0-9]):(?:[0-5][0-9])\.\d{3})"
>
<push state="cues" />
</rule>
</state>
<!-- https://www.w3.org/TR/webvtt1/#webvtt-region-settings-list -->
<state name="region-settings-list">
<rule pattern="(?: |\t|\r?\n(?!\r?\n))+">
<token type="Text" />
</rule>
<rule pattern="(?:\r?\n){2}">
<token type="Text" />
<pop depth="1" />
</rule>
<rule pattern="(id)(:)(?!&#45;&#45;&gt;)(\S+)">
<bygroups>
<token type="Keyword" />
<token type="Punctuation" />
<token type="Literal" />
</bygroups>
</rule>
<rule pattern="(width)(:)((?:[1-9]?\d|100)(?:\.\d+)?)(%)">
<bygroups>
<token type="Keyword" />
<token type="Punctuation" />
<token type="Literal" />
<token type="KeywordType" />
</bygroups>
</rule>
<rule pattern="(lines)(:)(\d+)">
<bygroups>
<token type="Keyword" />
<token type="Punctuation" />
<token type="Literal" />
</bygroups>
</rule>
<rule
pattern="(regionanchor|viewportanchor)(:)((?:[1-9]?\d|100)(?:\.\d+)?)(%)(,)((?:[1-9]?\d|100)(?:\.\d+)?)(%)">
<bygroups>
<token type="Keyword" />
<token type="Punctuation" />
<token type="Literal" />
<token type="KeywordType" />
<token type="Punctuation" />
<token type="Literal" />
<token type="KeywordType" />
</bygroups>
</rule>
<rule pattern="(scroll)(:)(up)">
<bygroups>
<token type="Keyword" />
<token type="Punctuation" />
<token type="KeywordConstant" />
</bygroups>
</rule>
</state>
<!-- https://www.w3.org/TR/webvtt1/#webvtt-comment-block -->
<state name="comment">
<rule
pattern="^NOTE( |\t|\r?\n)((?!&#45;&#45;&gt;)[\s\S])*?(?:(\r?\n){2}|\Z)">
<token type="Comment" />
</rule>
</state>
<!--
"Zero or more WebVTT cue blocks and WebVTT comment blocks separated from each other by one or more
WebVTT line terminators." (https://www.w3.org/TR/webvtt1/#file-structure)
-->
<state name="cues">
<rule
pattern="(?:((?!&#45;&#45;&gt;)[^\r\n]+)?(\r?\n))?((?:\d{2}:)?(?:[0-5][0-9]):(?:[0-5][0-9])\.\d{3})([ \t]+)(&#45;&#45;&gt;)([ \t]+)((?:\d{2}:)?(?:[0-5][0-9]):(?:[0-5][0-9])\.\d{3})([ \t]*)">
<bygroups>
<token type="Name" />
<token type="Text" />
<token type="LiteralDate" />
<token type="Text" />
<token type="Operator" />
<token type="Text" />
<token type="LiteralDate" />
<token type="Text" />
</bygroups>
<push state="cue-settings-list" />
</rule>
<rule>
<include state="comment" />
</rule>
</state>
<!-- https://www.w3.org/TR/webvtt1/#webvtt-cue-settings-list -->
<state name="cue-settings-list">
<rule pattern="[ \t]+">
<token type="Text" />
</rule>
<rule pattern="(vertical)(:)?(rl|lr)?">
<bygroups>
<token type="Keyword" />
<token type="Punctuation" />
<token type="KeywordConstant" />
</bygroups>
</rule>
<rule
pattern="(line)(:)?(?:(?:((?:[1-9]?\d|100)(?:\.\d+)?)(%)|(-?\d+))(?:(,)(start|center|end))?)?">
<bygroups>
<token type="Keyword" />
<token type="Punctuation" />
<token type="Literal" />
<token type="KeywordType" />
<token type="Literal" />
<token type="Punctuation" />
<token type="KeywordConstant" />
</bygroups>
</rule>
<rule
pattern="(position)(:)?(?:(?:((?:[1-9]?\d|100)(?:\.\d+)?)(%)|(-?\d+))(?:(,)(line-left|center|line-right))?)?">
<bygroups>
<token type="Keyword" />
<token type="Punctuation" />
<token type="Literal" />
<token type="KeywordType" />
<token type="Literal" />
<token type="Punctuation" />
<token type="KeywordConstant" />
</bygroups>
</rule>
<rule pattern="(size)(:)?(?:((?:[1-9]?\d|100)(?:\.\d+)?)(%))?">
<bygroups>
<token type="Keyword" />
<token type="Punctuation" />
<token type="Literal" />
<token type="KeywordType" />
</bygroups>
</rule>
<rule pattern="(align)(:)?(start|center|end|left|right)?">
<bygroups>
<token type="Keyword" />
<token type="Punctuation" />
<token type="KeywordConstant" />
</bygroups>
</rule>
<rule pattern="(region)(:)?((?![^\r\n]*&#45;&#45;&gt;(?=[ \t]+?))[^ \t\r\n]+)?">
<bygroups>
<token type="Keyword" />
<token type="Punctuation" />
<token type="Literal" />
</bygroups>
</rule>
<rule
pattern="(?=\r?\n)">
<push state="cue-payload" />
</rule>
</state>
<!-- https://www.w3.org/TR/webvtt1/#cue-payload -->
<state name="cue-payload">
<rule pattern="(\r?\n){2,}">
<token type="Text" />
<pop depth="2" />
</rule>
<rule pattern="[^&lt;&amp;]+?">
<token type="Text" />
</rule>
<rule pattern="&amp;(#\d+|#x[0-9A-Fa-f]+|[a-zA-Z0-9]+);">
<token type="Text" />
</rule>
<rule pattern="(?=&lt;)">
<token type="Text" />
<push state="cue-span-tag" />
</rule>
</state>
<state name="cue-span-tag">
<rule
pattern="&lt;(?=c|i|b|u|ruby|rt|v|lang|(?:\d{2}:)?(?:[0-5][0-9]):(?:[0-5][0-9])\.\d{3})">
<token type="Punctuation" />
<push state="cue-span-start-tag-name" />
</rule>
<rule pattern="(&lt;/)(c|i|b|u|ruby|rt|v|lang)">
<bygroups>
<token type="Punctuation" />
<token type="NameTag" />
</bygroups>
</rule>
<rule pattern="&gt;">
<token type="Punctuation" />
<pop depth="1" />
</rule>
</state>
<state name="cue-span-start-tag-name">
<rule pattern="(c|i|b|u|ruby|rt)|((?:\d{2}:)?(?:[0-5][0-9]):(?:[0-5][0-9])\.\d{3})">
<bygroups>
<token type="NameTag" />
<token type="LiteralDate" />
</bygroups>
<push state="cue-span-classes-without-annotations" />
</rule>
<rule pattern="v|lang">
<token type="NameTag" />
<push state="cue-span-classes-with-annotations" />
</rule>
</state>
<state name="cue-span-classes-without-annotations">
<rule>
<include state="cue-span-classes" />
</rule>
<rule pattern="(?=&gt;)">
<pop depth="2" />
</rule>
</state>
<state name="cue-span-classes-with-annotations">
<rule>
<include state="cue-span-classes" />
</rule>
<rule pattern="(?=[ \t])">
<push state="cue-span-start-tag-annotations" />
</rule>
</state>
<state name="cue-span-classes">
<rule pattern="(\.)([^ \t\n\r&amp;&lt;&gt;\.]+)">
<bygroups>
<token type="Punctuation" />
<token type="NameTag" />
</bygroups>
</rule>
</state>
<state name="cue-span-start-tag-annotations">
<rule
pattern="[ \t](?:[^\n\r&amp;&gt;]|&amp;(?:#\d+|#x[0-9A-Fa-f]+|[a-zA-Z0-9]+);)+">
<token type="Text" />
</rule>
<rule pattern="(?=&gt;)">
<token type="Text" />
<pop depth="3" />
</rule>
</state>
</rules>
</lexer>
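As a compact end-to-end check of the states above (file header, NOTE comment, cue identifier, timings, cue settings and a tagged payload), here is a hypothetical cue file pushed through the high-level API; the "vtt" name is the alias from the config block, and require "tartrazine" is assumed.

require "tartrazine"

vtt = <<-VTT
  WEBVTT

  NOTE A comment block, ignored by players.

  intro
  00:00:01.000 --> 00:00:04.000 align:center position:50%
  <v Narrator>Hello, <b>world</b>!
  VTT

puts Tartrazine.to_ansi(vtt, "vtt")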


@ -53,7 +53,7 @@
<bygroups> <bygroups>
<token type="Punctuation"/> <token type="Punctuation"/>
<token type="LiteralStringDoc"/> <token type="LiteralStringDoc"/>
<token type="TextWhitespace"/> <token type="Ignore"/>
</bygroups> </bygroups>
</rule> </rule>
<rule pattern="(false|False|FALSE|true|True|TRUE|null|Off|off|yes|Yes|YES|OFF|On|ON|no|No|on|NO|n|N|Y|y)\b"> <rule pattern="(false|False|FALSE|true|True|TRUE|null|Off|off|yes|Yes|YES|OFF|On|ON|no|No|on|NO|n|N|Y|y)\b">


@ -38,6 +38,12 @@ for fname in glob.glob("lexers/*.xml"):
lexer_by_filename[filename].add(lexer_name) lexer_by_filename[filename].add(lexer_name)
with open("src/constants/lexers.cr", "w") as f: with open("src/constants/lexers.cr", "w") as f:
# Crystal doesn't come from an XML file
lexer_by_name["crystal"] = "crystal"
lexer_by_name["cr"] = "crystal"
lexer_by_filename["*.cr"] = ["crystal"]
lexer_by_mimetype["text/x-crystal"] = "crystal"
f.write("module Tartrazine\n") f.write("module Tartrazine\n")
f.write(" LEXERS_BY_NAME = {\n") f.write(" LEXERS_BY_NAME = {\n")
for k in sorted(lexer_by_name.keys()): for k in sorted(lexer_by_name.keys()):
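Since Crystal's lexer is implemented natively rather than generated from an XML file, the script now seeds the lookup tables by hand before serialising them. The generated src/constants/lexers.cr then looks roughly like the sketch below; all entries except the hand-added Crystal ones are illustrative, and the filename/mimetype tables are assumed to follow the same shape.

module Tartrazine
  LEXERS_BY_NAME = {
    "bash"    => "bash",
    "cr"      => "crystal",
    "crystal" => "crystal",
    # ... one entry per name or alias found in lexers/*.xml ...
  }
  # The filename and mimetype tables are written the same way, e.g.
  # "*.cr" => ["crystal"] and "text/x-crystal" => "crystal".
end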


@ -1,5 +1,5 @@
name: tartrazine name: tartrazine
version: 0.8.0 version: 0.12.0
authors: authors:
- Roberto Alsina <roberto.alsina@gmail.com> - Roberto Alsina <roberto.alsina@gmail.com>


@ -1,4 +1,5 @@
require "./spec_helper" require "./spec_helper"
require "digest/sha1"
# These are the testcases from Pygments # These are the testcases from Pygments
testcases = Dir.glob("#{__DIR__}/tests/**/*txt").sort testcases = Dir.glob("#{__DIR__}/tests/**/*txt").sort
@ -46,8 +47,13 @@ known_bad = {
"#{__DIR__}/tests/bash_session/test_newline_in_ls_no_ps2.txt", "#{__DIR__}/tests/bash_session/test_newline_in_ls_no_ps2.txt",
"#{__DIR__}/tests/bash_session/test_newline_in_ls_ps2.txt", "#{__DIR__}/tests/bash_session/test_newline_in_ls_ps2.txt",
"#{__DIR__}/tests/bash_session/test_virtualenv.txt", "#{__DIR__}/tests/bash_session/test_virtualenv.txt",
"#{__DIR__}/tests/mcfunction/commenting.txt",
"#{__DIR__}/tests/mcfunction/coordinates.txt",
"#{__DIR__}/tests/mcfunction/data.txt", "#{__DIR__}/tests/mcfunction/data.txt",
"#{__DIR__}/tests/mcfunction/difficult_1.txt",
"#{__DIR__}/tests/mcfunction/multiline.txt",
"#{__DIR__}/tests/mcfunction/selectors.txt", "#{__DIR__}/tests/mcfunction/selectors.txt",
"#{__DIR__}/tests/mcfunction/simple.txt",
} }
# Tests that fail because of a limitation in PCRE2 # Tests that fail because of a limitation in PCRE2
@ -103,6 +109,7 @@ describe Tartrazine do
) )
end end
end end
describe "to_ansi" do describe "to_ansi" do
it "should do basic highlighting" do it "should do basic highlighting" do
ansi = Tartrazine.to_ansi("puts 'Hello, World!'", "ruby") ansi = Tartrazine.to_ansi("puts 'Hello, World!'", "ruby")
@ -114,11 +121,29 @@ describe Tartrazine do
) )
else else
ansi.should eq( ansi.should eq(
"\e[38;2;171;70;66mputs\e[0m\e[38;2;216;216;216m \e[0m'Hello, World!'" "\e[38;2;171;70;66mputs\e[0m\e[38;2;216;216;216m \e[0m\e[38;2;161;181;108m'Hello, World!'\e[0m"
) )
end end
end end
end end
describe "to_svg" do
it "should do basic highlighting" do
svg = Tartrazine.to_svg("puts 'Hello, World!'", "ruby", standalone: false)
svg.should eq(
"<text x=\"0\" y=\"19\" xml:space=\"preserve\"><tspan fill=\"#ab4642\">puts</tspan><tspan fill=\"#d8d8d8\"> </tspan><tspan fill=\"#a1b56c\">&#39;Hello, World!&#39;</tspan></text>"
)
end
end
describe "to_png" do
it "should do basic highlighting" do
png = Digest::SHA1.hexdigest(Tartrazine.to_png("puts 'Hello, World!'", "ruby"))
png.should eq(
"62d419dcd263fffffc265a0f04c156dc2530c362"
)
end
end
end end
# Helper that creates lexer and tokenizes # Helper that creates lexer and tokenizes


@ -471,7 +471,7 @@ module Tartrazine
"application/x-fennel" => "fennel", "application/x-fennel" => "fennel",
"application/x-fish" => "fish", "application/x-fish" => "fish",
"application/x-forth" => "forth", "application/x-forth" => "forth",
"application/x-gdscript" => "gdscript", "application/x-gdscript" => "gdscript3",
"application/x-hcl" => "hcl", "application/x-hcl" => "hcl",
"application/x-hy" => "hy", "application/x-hy" => "hy",
"application/x-javascript" => "javascript", "application/x-javascript" => "javascript",
@ -594,7 +594,7 @@ module Tartrazine
"text/x-fortran" => "fortran", "text/x-fortran" => "fortran",
"text/x-fsharp" => "fsharp", "text/x-fsharp" => "fsharp",
"text/x-gas" => "gas", "text/x-gas" => "gas",
"text/x-gdscript" => "gdscript", "text/x-gdscript" => "gdscript3",
"text/x-gherkin" => "gherkin", "text/x-gherkin" => "gherkin",
"text/x-gleam" => "gleam", "text/x-gleam" => "gleam",
"text/x-glslsrc" => "glsl", "text/x-glslsrc" => "glsl",


@ -34,8 +34,6 @@ module Tartrazine
end end
def colorize(text : String, token : String) : String def colorize(text : String, token : String) : String
style = theme.styles.fetch(token, nil)
return text if style.nil?
if theme.styles.has_key?(token) if theme.styles.has_key?(token)
s = theme.styles[token] s = theme.styles[token]
else else


@ -28,6 +28,13 @@ module Tartrazine
property? surrounding_pre : Bool = true property? surrounding_pre : Bool = true
property? wrap_long_lines : Bool = false property? wrap_long_lines : Bool = false
property weight_of_bold : Int32 = 600 property weight_of_bold : Int32 = 600
property template : String = <<-TEMPLATE
<!DOCTYPE html><html><head><style>
{{style_defs}}
</style></head><body>
{{body}}
</body></html>
TEMPLATE
property theme : Theme property theme : Theme
@ -42,7 +49,8 @@ module Tartrazine
@standalone : Bool = false, @standalone : Bool = false,
@surrounding_pre : Bool = true, @surrounding_pre : Bool = true,
@wrap_long_lines : Bool = false, @wrap_long_lines : Bool = false,
@weight_of_bold : Int32 = 600) @weight_of_bold : Int32 = 600,
@template : String = @template)
end end
def format(text : String, lexer : Lexer) : String def format(text : String, lexer : Lexer) : String
@ -61,11 +69,15 @@ module Tartrazine
# Wrap text into a full HTML document, including the CSS for the theme # Wrap text into a full HTML document, including the CSS for the theme
def wrap_standalone def wrap_standalone
output = String.build do |outp| output = String.build do |outp|
outp << "<!DOCTYPE html><html><head><style>" if @template.includes? "{{style_defs}}"
outp << style_defs outp << @template.split("{{style_defs}}")[0]
outp << "</style></head><body>" outp << style_defs
outp << @template.split("{{style_defs}}")[1].split("{{body}}")[0]
else
outp << @template.split("{{body}}")[0]
end
end end
{output.to_s, "</body></html>"} {output.to_s, @template.split("{{body}}")[1]}
end end
private def line_label(i : Int32) : String private def line_label(i : Int32) : String
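A hedged usage sketch of the new template support, based only on the properties shown above: {{style_defs}} and {{body}} are the two placeholders wrap_standalone splits on. The class name Tartrazine::Html is inferred from the Tartrazine::Ansi and Tartrazine::Png classes elsewhere in this diff, the argument-free constructor is assumed from how the CLI configures the formatter via setters, and "default-dark" is the theme name the PNG helper uses as a default.

require "tartrazine"

my_template = <<-HTML
  <!DOCTYPE html>
  <html><head><title>Listing</title><style>{{style_defs}}</style></head>
  <body><main class="listing">{{body}}</main></body></html>
  HTML

formatter = Tartrazine::Html.new
formatter.standalone = true
formatter.theme = Tartrazine.theme("default-dark")
formatter.template = my_template
puts formatter.format("puts 'Hello, World!'", Tartrazine.lexer(name: "crystal"))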


@ -1,16 +1,20 @@
require "../formatter" require "../formatter"
require "compress/gzip"
require "digest/sha1"
require "stumpy_png" require "stumpy_png"
require "stumpy_utils" require "stumpy_utils"
require "compress/gzip"
module Tartrazine module Tartrazine
def self.to_png(text : String, language : String, def self.to_png(text : String, language : String,
theme : String = "default-dark", theme : String = "default-dark",
line_numbers : Bool = false) : String line_numbers : Bool = false) : String
buf = IO::Memory.new
Tartrazine::Png.new( Tartrazine::Png.new(
theme: Tartrazine.theme(theme), theme: Tartrazine.theme(theme),
line_numbers: line_numbers line_numbers: line_numbers
).format(text, Tartrazine.lexer(name: language)) ).format(text, Tartrazine.lexer(name: language), buf)
buf.to_s
end end
class FontFiles class FontFiles
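With the formatter now writing into an IO and the helper collecting the result in an IO::Memory, Tartrazine.to_png returns the PNG bytes as a String, so saving a highlighted image is just a matter of writing that string to disk. A small sketch using the signature shown above (the output filename is illustrative):

require "tartrazine"

# Render a snippet to PNG and write the bytes out.
png_bytes = Tartrazine.to_png("puts 'Hello, World!'", "crystal",
  theme: "default-dark", line_numbers: true)
File.write("hello.png", png_bytes)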


@ -6,11 +6,21 @@ require "crystal/syntax_highlighter"
module Tartrazine module Tartrazine
class LexerFiles class LexerFiles
extend BakedFileSystem extend BakedFileSystem
bake_folder "../lexers", __DIR__
macro bake_selected_lexers
{% for lexer in env("TT_LEXERS").split "," %}
bake_file {{ lexer }}+".xml", {{ read_file "#{__DIR__}/../lexers/" + lexer + ".xml" }}
{% end %}
end
{% if flag?(:nolexers) %}
bake_selected_lexers
{% else %}
bake_folder "../lexers", __DIR__
{% end %}
end end
# Get the lexer object for a language name # Get the lexer object for a language name
# FIXME: support mimetypes
def self.lexer(name : String? = nil, filename : String? = nil, mimetype : String? = nil) : BaseLexer def self.lexer(name : String? = nil, filename : String? = nil, mimetype : String? = nil) : BaseLexer
return lexer_by_name(name) if name && name != "autodetect" return lexer_by_name(name) if name && name != "autodetect"
return lexer_by_filename(filename) if filename return lexer_by_filename(filename) if filename
@ -33,6 +43,8 @@ module Tartrazine
raise Exception.new("Unknown lexer: #{name}") if lexer_file_name.nil? raise Exception.new("Unknown lexer: #{name}") if lexer_file_name.nil?
RegexLexer.from_xml(LexerFiles.get("/#{lexer_file_name}.xml").gets_to_end) RegexLexer.from_xml(LexerFiles.get("/#{lexer_file_name}.xml").gets_to_end)
rescue ex : BakedFileSystem::NoSuchFileError
raise Exception.new("Unknown lexer: #{name}")
end end
private def self.lexer_by_filename(filename : String) : BaseLexer private def self.lexer_by_filename(filename : String) : BaseLexer
@ -84,7 +96,8 @@ module Tartrazine
# Return a list of all lexers # Return a list of all lexers
def self.lexers : Array(String) def self.lexers : Array(String)
LEXERS_BY_NAME.keys.sort! file_map = LexerFiles.files.map(&.path)
LEXERS_BY_NAME.keys.select { |k| file_map.includes?("/#{k}.xml") }.sort!
end end
# A token, the output of the tokenizer # A token, the output of the tokenizer
@ -337,6 +350,13 @@ module Tartrazine
class CustomCrystalHighlighter < Crystal::SyntaxHighlighter class CustomCrystalHighlighter < Crystal::SyntaxHighlighter
@tokens = [] of Token @tokens = [] of Token
def highlight(text)
super
rescue ex : Crystal::SyntaxException
# Fallback to Ruby highlighting
@tokens = Tartrazine.lexer("ruby").tokenizer(text).to_a
end
def render_delimiter(&block) def render_delimiter(&block)
@tokens << {type: "LiteralString", value: block.call.to_s} @tokens << {type: "LiteralString", value: block.call.to_s}
end end
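Taken together, these changes mean the set of available lexers now depends on what was baked in at compile time, unknown names fail with a clear "Unknown lexer" message, and invalid Crystal source degrades to Ruby highlighting instead of raising. A speculative sketch of the observable behaviour; the "crystal" call assumes that name routes to the native highlighter shown above.

require "tartrazine"

# Only lexers whose XML was actually baked into the binary are listed.
puts Tartrazine.lexers.join(", ")

# Invalid Crystal source no longer blows up: the highlighter rescues
# Crystal::SyntaxException and falls back to the Ruby lexer's tokens.
puts Tartrazine.to_ansi("def broken(", "crystal")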


@ -10,8 +10,8 @@ Keep in mind that not all formatters support all features.
Usage: Usage:
tartrazine (-h, --help) tartrazine (-h, --help)
tartrazine FILE -f html [-t theme][--standalone][--line-numbers] tartrazine FILE -f html [-t theme][--standalone [--template file]]
[-l lexer][-o output] [--line-numbers][-l lexer][-o output]
tartrazine -f html -t theme --css tartrazine -f html -t theme --css
tartrazine FILE -f terminal [-t theme][-l lexer][--line-numbers] tartrazine FILE -f terminal [-t theme][-l lexer][--line-numbers]
[-o output] [-o output]
@ -35,6 +35,7 @@ Options:
all style information. If not given, it will generate just all style information. If not given, it will generate just
an HTML fragment ready to include in your own page. an HTML fragment ready to include in your own page.
--css Generate a CSS file for the theme called <theme>.css --css Generate a CSS file for the theme called <theme>.css
--template <file> Use a custom template for the HTML output [default: none]
--line-numbers Include line numbers in the output --line-numbers Include line numbers in the output
-h, --help Show this screen -h, --help Show this screen
-v, --version Show version number -v, --version Show version number
@ -64,6 +65,12 @@ if options["--list-formatters"]
end end
theme = Tartrazine.theme(options["-t"].as(String)) theme = Tartrazine.theme(options["-t"].as(String))
template = options["--template"].as(String)
if template != "none" # Otherwise we will use the default template
template = File.open(template).gets_to_end
else
template = nil
end
if options["-f"] if options["-f"]
formatter = options["-f"].as(String) formatter = options["-f"].as(String)
@ -73,6 +80,7 @@ if options["-f"]
formatter.standalone = options["--standalone"] != nil formatter.standalone = options["--standalone"] != nil
formatter.line_numbers = options["--line-numbers"] != nil formatter.line_numbers = options["--line-numbers"] != nil
formatter.theme = theme formatter.theme = theme
formatter.template = template if template
when "terminal" when "terminal"
formatter = Tartrazine::Ansi.new formatter = Tartrazine::Ansi.new
formatter.line_numbers = options["--line-numbers"] != nil formatter.line_numbers = options["--line-numbers"] != nil


@ -11,7 +11,20 @@ module Tartrazine
struct ThemeFiles struct ThemeFiles
extend BakedFileSystem extend BakedFileSystem
bake_folder "../styles", __DIR__
macro bake_selected_themes
{% if env("TT_THEMES") %}
{% for theme in env("TT_THEMES").split "," %}
bake_file {{ theme }}+".xml", {{ read_file "#{__DIR__}/../styles/" + theme + ".xml" }}
{% end %}
{% end %}
end
{% if flag?(:nothemes) %}
bake_selected_themes
{% else %}
bake_folder "../styles", __DIR__
{% end %}
end end
def self.theme(name : String) : Theme def self.theme(name : String) : Theme
@ -22,8 +35,8 @@ module Tartrazine
end end
begin begin
Theme.from_xml(ThemeFiles.get("/#{name}.xml").gets_to_end) Theme.from_xml(ThemeFiles.get("/#{name}.xml").gets_to_end)
rescue rescue ex : Exception
raise Exception.new("Theme #{name} not found") raise Exception.new("Error loading theme #{name}: #{ex.message}")
end end
end end
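The same conditional-baking pattern is applied to themes, and the rescue now surfaces the underlying load error instead of the generic "not found" message. A brief hypothetical sketch of what callers see (theme names are illustrative):

require "tartrazine"

Tartrazine.theme("github") # loads the baked styles/github.xml
begin
  Tartrazine.theme("no-such-theme")
rescue ex : Exception
  # The message now reads roughly "Error loading theme no-such-theme: ..."
  # with the original file or XML error appended.
  puts ex.message
end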


@ -1,44 +1,39 @@
<style name="github"> <style name="github">
<entry type="Error" style="#a61717 bg:#e3d2d2"/> <entry type="Error" style="#f6f8fa bg:#82071e"/>
<entry type="Background" style="bg:#ffffff"/> <entry type="Background" style="bg:#ffffff"/>
<entry type="Keyword" style="bold #000000"/> <entry type="Keyword" style="#cf222e"/>
<entry type="KeywordType" style="bold #445588"/> <entry type="KeywordType" style="#cf222e"/>
<entry type="NameAttribute" style="#008080"/> <entry type="NameAttribute" style="#1f2328"/>
<entry type="NameBuiltin" style="#0086b3"/> <entry type="NameBuiltin" style="#6639ba"/>
<entry type="NameBuiltinPseudo" style="#999999"/> <entry type="NameBuiltinPseudo" style="#6a737d"/>
<entry type="NameClass" style="bold #445588"/> <entry type="NameClass" style="#1f2328"/>
<entry type="NameConstant" style="#008080"/> <entry type="NameConstant" style="#0550ae"/>
<entry type="NameDecorator" style="bold #3c5d5d"/> <entry type="NameDecorator" style="#0550ae"/>
<entry type="NameEntity" style="#800080"/> <entry type="NameEntity" style="#6639ba"/>
<entry type="NameException" style="bold #990000"/> <entry type="NameFunction" style="#6639ba"/>
<entry type="NameFunction" style="bold #990000"/>
<entry type="NameLabel" style="bold #990000"/> <entry type="NameLabel" style="bold #990000"/>
<entry type="NameNamespace" style="#555555"/> <entry type="NameNamespace" style="#24292e"/>
<entry type="NameTag" style="#000080"/> <entry type="NameOther" style="#1f2328"/>
<entry type="NameVariable" style="#008080"/> <entry type="NameTag" style="#0550ae"/>
<entry type="NameVariableClass" style="#008080"/> <entry type="NameVariable" style="#953800"/>
<entry type="NameVariableGlobal" style="#008080"/> <entry type="NameVariableClass" style="#953800"/>
<entry type="NameVariableInstance" style="#008080"/> <entry type="NameVariableGlobal" style="#953800"/>
<entry type="LiteralString" style="#dd1144"/> <entry type="NameVariableInstance" style="#953800"/>
<entry type="LiteralStringRegex" style="#009926"/> <entry type="LiteralString" style="#0a3069"/>
<entry type="LiteralStringSymbol" style="#990073"/> <entry type="LiteralStringRegex" style="#0a3069"/>
<entry type="LiteralNumber" style="#009999"/> <entry type="LiteralStringSymbol" style="#032f62"/>
<entry type="Operator" style="bold #000000"/> <entry type="LiteralNumber" style="#0550ae"/>
<entry type="Comment" style="italic #999988"/> <entry type="Operator" style="#0550ae"/>
<entry type="CommentMultiline" style="italic #999988"/> <entry type="Comment" style="#57606a"/>
<entry type="CommentSingle" style="italic #999988"/> <entry type="CommentMultiline" style="#57606a"/>
<entry type="CommentSpecial" style="bold italic #999999"/> <entry type="CommentSingle" style="#57606a"/>
<entry type="CommentPreproc" style="bold #999999"/> <entry type="CommentSpecial" style="#57606a"/>
<entry type="GenericDeleted" style="#000000 bg:#ffdddd"/> <entry type="CommentPreproc" style="#57606a"/>
<entry type="GenericEmph" style="italic #000000"/> <entry type="GenericDeleted" style="#82071e bg:#ffebe9"/>
<entry type="GenericError" style="#aa0000"/> <entry type="GenericEmph" style="#1f2328"/>
<entry type="GenericHeading" style="#999999"/> <entry type="GenericInserted" style="#116329 bg:#dafbe1"/>
<entry type="GenericInserted" style="#000000 bg:#ddffdd"/> <entry type="GenericOutput" style="#1f2328"/>
<entry type="GenericOutput" style="#888888"/>
<entry type="GenericPrompt" style="#555555"/>
<entry type="GenericStrong" style="bold"/>
<entry type="GenericSubheading" style="#aaaaaa"/>
<entry type="GenericTraceback" style="#aa0000"/>
<entry type="GenericUnderline" style="underline"/> <entry type="GenericUnderline" style="underline"/>
<entry type="TextWhitespace" style="#bbbbbb"/> <entry type="Punctuation" style="#1f2328"/>
<entry type="TextWhitespace" style="#ffffff"/>
</style> </style>

styles/onesenterprise.xml Normal file

@ -0,0 +1,10 @@
<style name="onesenterprise">
<entry type="Keyword" style="#ff0000"/>
<entry type="Name" style="#0000ff"/>
<entry type="LiteralString" style="#000000"/>
<entry type="Operator" style="#ff0000"/>
<entry type="Punctuation" style="#ff0000"/>
<entry type="Comment" style="#008000"/>
<entry type="CommentPreproc" style="#963200"/>
<entry type="Text" style="#000000"/>
</style>

styles/pygments.xml Normal file

@ -0,0 +1,42 @@
<style name="pygments">
<entry type="Error" style="border:#ff0000"/>
<entry type="Keyword" style="bold #008000"/>
<entry type="KeywordPseudo" style="nobold"/>
<entry type="KeywordType" style="nobold #b00040"/>
<entry type="NameAttribute" style="#7d9029"/>
<entry type="NameBuiltin" style="#008000"/>
<entry type="NameClass" style="bold #0000ff"/>
<entry type="NameConstant" style="#880000"/>
<entry type="NameDecorator" style="#aa22ff"/>
<entry type="NameEntity" style="bold #999999"/>
<entry type="NameException" style="bold #d2413a"/>
<entry type="NameFunction" style="#0000ff"/>
<entry type="NameLabel" style="#a0a000"/>
<entry type="NameNamespace" style="bold #0000ff"/>
<entry type="NameTag" style="bold #008000"/>
<entry type="NameVariable" style="#19177c"/>
<entry type="LiteralString" style="#ba2121"/>
<entry type="LiteralStringDoc" style="italic"/>
<entry type="LiteralStringEscape" style="bold #bb6622"/>
<entry type="LiteralStringInterpol" style="bold #bb6688"/>
<entry type="LiteralStringOther" style="#008000"/>
<entry type="LiteralStringRegex" style="#bb6688"/>
<entry type="LiteralStringSymbol" style="#19177c"/>
<entry type="LiteralNumber" style="#666666"/>
<entry type="Operator" style="#666666"/>
<entry type="OperatorWord" style="bold #aa22ff"/>
<entry type="Comment" style="italic #408080"/>
<entry type="CommentPreproc" style="noitalic #bc7a00"/>
<entry type="GenericDeleted" style="#a00000"/>
<entry type="GenericEmph" style="italic"/>
<entry type="GenericError" style="#ff0000"/>
<entry type="GenericHeading" style="bold #000080"/>
<entry type="GenericInserted" style="#00a000"/>
<entry type="GenericOutput" style="#888888"/>
<entry type="GenericPrompt" style="bold #000080"/>
<entry type="GenericStrong" style="bold"/>
<entry type="GenericSubheading" style="bold #800080"/>
<entry type="GenericTraceback" style="#0044dd"/>
<entry type="GenericUnderline" style="underline"/>
<entry type="TextWhitespace" style="#bbbbbb"/>
</style>