14 Commits

SHA1 Message Date
0e53b625b9 bump: Release v0.10.0 2024-09-26 20:51:35 -03:00
ecdb9d4523 bump: Release v0.10.0 2024-09-26 20:50:43 -03:00
e11775040c chore(build): strip static binary 2024-09-26 20:50:14 -03:00
30bc8cccba chore: build before tag 2024-09-26 20:47:58 -03:00
1638c253cb bump: Release v0.10.0 2024-09-26 20:35:51 -03:00
c374f52aee Merge pull request #9 from ralsina/conditional-lexers-and-themes (Conditional lexers and themes) 2024-09-26 20:34:24 -03:00
96fd9bdfe9 fix: Fix metadata to show crystal 2024-09-26 18:47:47 -03:00
0423811c5d feat: optional conditional baking of lexers 2024-09-26 18:47:47 -03:00
3d9d3ab5cf fix: strip binaries for release artifacts 2024-09-21 21:28:13 -03:00
92a97490f1 bump: Release v0.9.1 2024-09-21 21:08:41 -03:00
22decedf3a test: added minimal tests for svg and png formatters 2024-09-21 21:08:03 -03:00
8b34a1659d fix: Bug in high-level API for png formatter 2024-09-21 21:07:44 -03:00
3bf8172b89 fix: Terminal formatter was skipping things that it could highlight 2024-09-21 20:57:24 -03:00
4432da2893 bump: Release v0.9.0 2024-09-21 20:33:24 -03:00
11 changed files with 149 additions and 14 deletions

View File

@@ -2,6 +2,81 @@
All notable changes to this project will be documented in this file.
## [0.10.0] - 2024-09-26
### 🚀 Features
- Optional conditional baking of lexers
### 🐛 Bug Fixes
- Strip binaries for release artifacts
- Fix metadata to show crystal
### ⚙️ Miscellaneous Tasks
- Build before tag
- *(build)* Strip static binary
### Bump
- Release v0.10.0
- Release v0.10.0
## [0.10.0] - 2024-09-26
### 🚀 Features
- Optional conditional baking of lexers
### 🐛 Bug Fixes
- Strip binaries for release artifacts
- Fix metadata to show crystal
### ⚙️ Miscellaneous Tasks
- Build before tag
- *(build)* Strip static binary
### Bump
- Release v0.10.0
## [0.10.0] - 2024-09-26
### 🚀 Features
- Optional conditional baking of lexers
### 🐛 Bug Fixes
- Strip binaries for release artifacts
- Fix metadata to show crystal
## [0.9.1] - 2024-09-22
### 🐛 Bug Fixes
- Terminal formatter was skipping things that it could highlight
- Bug in high-level API for png formatter
### 🧪 Testing
- Added minimal tests for svg and png formatters
## [0.9.0] - 2024-09-21
### 🚀 Features
- PNG writer based on Stumpy libs
### ⚙️ Miscellaneous Tasks
- Clean
- Detect version bump in release script
- Improve changelog handling
## [0.8.0] - 2024-09-21
### 🚀 Features

View File

@@ -82,6 +82,25 @@ puts formatter.format("puts \"Hello, world!\"", lexer)
The reason you may want to use the manual version is to reuse
the lexer and formatter objects for performance reasons.
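A minimal sketch of that reuse, assuming a `Tartrazine::Html` formatter with a `theme:` argument (only `Tartrazine.lexer`, `Tartrazine.theme`, and `format(text, lexer)` appear in the diffs below):
```crystal
require "tartrazine"

# Build the lexer and formatter once...
lexer = Tartrazine.lexer(name: "crystal")
formatter = Tartrazine::Html.new(theme: Tartrazine.theme("default-dark")) # constructor assumed

# ...then reuse both objects for every snippet instead of recreating them.
["puts 1", "puts 2"].each do |code|
  puts formatter.format(code, lexer)
end
```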
## Choosing what Lexers you want
By default, Tartrazine embeds all of its lexers in the binary, which makes
the binary large. If you are using it as a library, you may want to include
only a selection of lexers. To do that:
* Pass the `-Dnolexers` flag to the compiler
* Set the `TT_LEXERS` environment variable to a
comma-separated list of lexers you want to include.
This builds a binary with only the python, markdown, bash and yaml lexers (enough to highlight this `README.md`):
```bash
> TT_LEXERS=python,markdown,bash,yaml shards build -Dnolexers -d --error-trace
Dependencies are satisfied
Building: tartrazine
```
## Contributing
1. Fork it (<https://github.com/ralsina/tartrazine/fork>)

View File

@@ -7,10 +7,10 @@ docker run --rm --privileged \
# Build for AMD64
docker build . -f Dockerfile.static -t tartrazine-builder
docker run -ti --rm -v "$PWD":/app --user="$UID" tartrazine-builder /bin/sh -c "cd /app && rm -rf lib shard.lock && shards build --static --release"
docker run -ti --rm -v "$PWD":/app --user="$UID" tartrazine-builder /bin/sh -c "cd /app && rm -rf lib shard.lock && shards build --static --release && strip bin/tartrazine"
mv bin/tartrazine bin/tartrazine-static-linux-amd64
# Build for ARM64
docker build . -f Dockerfile.static --platform linux/arm64 -t tartrazine-builder
docker run -ti --rm -v "$PWD":/app --platform linux/arm64 --user="$UID" tartrazine-builder /bin/sh -c "cd /app && rm -rf lib shard.lock && shards build --static --release"
docker run -ti --rm -v "$PWD":/app --platform linux/arm64 --user="$UID" tartrazine-builder /bin/sh -c "cd /app && rm -rf lib shard.lock && shards build --static --release && strip bin/tartrazine"
mv bin/tartrazine bin/tartrazine-static-linux-arm64

View File

@@ -9,7 +9,7 @@ git add shard.yml
hace lint test
git cliff --bump -u -p CHANGELOG.md
git commit -a -m "bump: Release v$VERSION"
hace static
git tag "v$VERSION"
git push --tags
hace static
gh release create "v$VERSION" "bin/$PKGNAME-static-linux-amd64" "bin/$PKGNAME-static-linux-arm64" --title "Release v$VERSION" --notes "$(git cliff -l -s all)"

View File

@@ -38,6 +38,12 @@ for fname in glob.glob("lexers/*.xml"):
lexer_by_filename[filename].add(lexer_name)
with open("src/constants/lexers.cr", "w") as f:
# Crystal doesn't come from an XML file
lexer_by_name["crystal"] = "crystal"
lexer_by_name["cr"] = "crystal"
lexer_by_filename["*.cr"] = ["crystal"]
lexer_by_mimetype["text/x-crystal"] = "crystal"
f.write("module Tartrazine\n")
f.write(" LEXERS_BY_NAME = {\n")
for k in sorted(lexer_by_name.keys()):
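For context, the generated `src/constants/lexers.cr` then contains hashes like the following (shape only; apart from the crystal entries added above, the contents are assumptions):
```crystal
# Excerpt of the generated constants file (not verbatim).
module Tartrazine
  LEXERS_BY_NAME = {
    "cr"      => "crystal",
    "crystal" => "crystal",
    # ... one entry per lexer name and alias found in lexers/*.xml
  }
end
```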

View File

@@ -1,5 +1,5 @@
name: tartrazine
version: 0.8.0
version: 0.10.0
authors:
- Roberto Alsina <roberto.alsina@gmail.com>

View File

@@ -1,4 +1,5 @@
require "./spec_helper"
require "digest/sha1"
# These are the testcases from Pygments
testcases = Dir.glob("#{__DIR__}/tests/**/*txt").sort
@@ -103,6 +104,7 @@ describe Tartrazine do
)
end
end
describe "to_ansi" do
it "should do basic highlighting" do
ansi = Tartrazine.to_ansi("puts 'Hello, World!'", "ruby")
@@ -114,11 +116,29 @@
)
else
ansi.should eq(
"\e[38;2;171;70;66mputs\e[0m\e[38;2;216;216;216m \e[0m'Hello, World!'"
"\e[38;2;171;70;66mputs\e[0m\e[38;2;216;216;216m \e[0m\e[38;2;161;181;108m'Hello, World!'\e[0m"
)
end
end
end
describe "to_svg" do
it "should do basic highlighting" do
svg = Tartrazine.to_svg("puts 'Hello, World!'", "ruby", standalone: false)
svg.should eq(
"<text x=\"0\" y=\"19\" xml:space=\"preserve\"><tspan fill=\"#ab4642\">puts</tspan><tspan fill=\"#d8d8d8\"> </tspan><tspan fill=\"#a1b56c\">&#39;Hello, World!&#39;</tspan></text>"
)
end
end
describe "to_png" do
it "should do basic highlighting" do
png = Digest::SHA1.hexdigest(Tartrazine.to_png("puts 'Hello, World!'", "ruby"))
png.should eq(
"62d419dcd263fffffc265a0f04c156dc2530c362"
)
end
end
end
# Helper that creates lexer and tokenizes
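If the PNG output ever changes (fonts, theme, or renderer tweaks), the golden hash above can be recomputed with the same call the spec uses; a sketch, run from the project root:
```crystal
require "digest/sha1"
require "tartrazine"

# Prints the SHA1 of the rendered PNG bytes; the spec compares against this value.
puts Digest::SHA1.hexdigest(Tartrazine.to_png("puts 'Hello, World!'", "ruby"))
```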

View File

@@ -471,7 +471,7 @@ module Tartrazine
"application/x-fennel" => "fennel",
"application/x-fish" => "fish",
"application/x-forth" => "forth",
"application/x-gdscript" => "gdscript",
"application/x-gdscript" => "gdscript3",
"application/x-hcl" => "hcl",
"application/x-hy" => "hy",
"application/x-javascript" => "javascript",
@@ -594,7 +594,7 @@ module Tartrazine
"text/x-fortran" => "fortran",
"text/x-fsharp" => "fsharp",
"text/x-gas" => "gas",
"text/x-gdscript" => "gdscript",
"text/x-gdscript" => "gdscript3",
"text/x-gherkin" => "gherkin",
"text/x-gleam" => "gleam",
"text/x-glslsrc" => "glsl",

View File

@@ -34,8 +34,6 @@ module Tartrazine
end
def colorize(text : String, token : String) : String
style = theme.styles.fetch(token, nil)
return text if style.nil?
if theme.styles.has_key?(token)
s = theme.styles[token]
else

View File

@@ -1,16 +1,20 @@
require "../formatter"
require "compress/gzip"
require "digest/sha1"
require "stumpy_png"
require "stumpy_utils"
require "compress/gzip"
module Tartrazine
def self.to_png(text : String, language : String,
theme : String = "default-dark",
line_numbers : Bool = false) : String
buf = IO::Memory.new
Tartrazine::Png.new(
theme: Tartrazine.theme(theme),
line_numbers: line_numbers
).format(text, Tartrazine.lexer(name: language))
).format(text, Tartrazine.lexer(name: language), buf)
buf.to_s
end
class FontFiles
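Since `to_png` now renders into an in-memory buffer and returns its contents as a `String`, callers can write the bytes straight to disk; a minimal sketch (the output path is illustrative):
```crystal
require "tartrazine"

# Render a snippet to PNG bytes using the signature shown above,
# then write them out as a regular file.
png_bytes = Tartrazine.to_png("puts \"Hello, world!\"", "crystal",
  theme: "default-dark", line_numbers: false)
File.write("hello.png", png_bytes)
```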

View File

@@ -6,11 +6,21 @@ require "crystal/syntax_highlighter"
module Tartrazine
class LexerFiles
extend BakedFileSystem
bake_folder "../lexers", __DIR__
macro bake_selected_lexers
{% for lexer in env("TT_LEXERS").split "," %}
bake_file {{ lexer }}+".xml", {{ read_file "lexers/" + lexer + ".xml" }}
{% end %}
end
{% if flag?(:nolexers) %}
bake_selected_lexers
{% else %}
bake_folder "../lexers", __DIR__
{% end %}
end
# Get the lexer object for a language name
# FIXME: support mimetypes
def self.lexer(name : String? = nil, filename : String? = nil, mimetype : String? = nil) : BaseLexer
return lexer_by_name(name) if name && name != "autodetect"
return lexer_by_filename(filename) if filename
@@ -33,6 +43,8 @@ module Tartrazine
raise Exception.new("Unknown lexer: #{name}") if lexer_file_name.nil?
RegexLexer.from_xml(LexerFiles.get("/#{lexer_file_name}.xml").gets_to_end)
rescue ex : BakedFileSystem::NoSuchFileError
raise Exception.new("Unknown lexer: #{name}")
end
private def self.lexer_by_filename(filename : String) : BaseLexer
@@ -84,7 +96,8 @@
# Return a list of all lexers
def self.lexers : Array(String)
LEXERS_BY_NAME.keys.sort!
file_map = LexerFiles.files.map(&.path)
LEXERS_BY_NAME.keys.select { |k| file_map.includes?("/#{k}.xml") }.sort!
end
# A token, the output of the tokenizer