1 Commit

Author SHA1 Message Date
f2d802e391 bump: Release v0.7.0 2024-09-09 22:00:54 -03:00
28 changed files with 131 additions and 517 deletions

View File

@@ -1,9 +1,9 @@
# This configuration file was generated by `ameba --gen-config`
# on 2024-09-21 14:59:30 UTC using Ameba version 1.6.1.
# on 2024-08-12 22:00:49 UTC using Ameba version 1.6.1.
# The point is for the user to remove these configuration records
# one by one as the reported problems are removed from the code base.
# Problems found: 3
# Problems found: 2
# Run `ameba --only Documentation/DocumentationAdmonition` for details
Documentation/DocumentationAdmonition:
Description: Reports documentation admonitions
@@ -18,17 +18,104 @@ Documentation/DocumentationAdmonition:
Enabled: true
Severity: Warning
# Problems found: 1
# Run `ameba --only Lint/SpecFilename` for details
Lint/SpecFilename:
Description: Enforces spec filenames to have `_spec` suffix
# Problems found: 22
# Run `ameba --only Lint/MissingBlockArgument` for details
Lint/MissingBlockArgument:
Description: Disallows yielding method definitions without block argument
Excluded:
- spec/examples/crystal/hello.cr
IgnoredDirs:
- spec/support
- spec/fixtures
- spec/data
IgnoredFilenames:
- spec_helper
- pygments/tests/examplefiles/cr/test.cr
Enabled: true
Severity: Warning
# Problems found: 1
# Run `ameba --only Lint/NotNil` for details
Lint/NotNil:
Description: Identifies usage of `not_nil!` calls
Excluded:
- pygments/tests/examplefiles/cr/test.cr
Enabled: true
Severity: Warning
# Problems found: 34
# Run `ameba --only Lint/ShadowingOuterLocalVar` for details
Lint/ShadowingOuterLocalVar:
Description: Disallows the usage of the same name as outer local variables for block
or proc arguments
Excluded:
- pygments/tests/examplefiles/cr/test.cr
Enabled: true
Severity: Warning
# Problems found: 1
# Run `ameba --only Lint/UnreachableCode` for details
Lint/UnreachableCode:
Description: Reports unreachable code
Excluded:
- pygments/tests/examplefiles/cr/test.cr
Enabled: true
Severity: Warning
# Problems found: 6
# Run `ameba --only Lint/UselessAssign` for details
Lint/UselessAssign:
Description: Disallows useless variable assignments
ExcludeTypeDeclarations: false
Excluded:
- pygments/tests/examplefiles/cr/test.cr
Enabled: true
Severity: Warning
# Problems found: 3
# Run `ameba --only Naming/BlockParameterName` for details
Naming/BlockParameterName:
Description: Disallows non-descriptive block parameter names
MinNameLength: 3
AllowNamesEndingInNumbers: true
Excluded:
- pygments/tests/examplefiles/cr/test.cr
AllowedNames:
- _
- e
- i
- j
- k
- v
- x
- y
- ex
- io
- ws
- op
- tx
- id
- ip
- k1
- k2
- v1
- v2
ForbiddenNames: []
Enabled: true
Severity: Convention
# Problems found: 1
# Run `ameba --only Naming/RescuedExceptionsVariableName` for details
Naming/RescuedExceptionsVariableName:
Description: Makes sure that rescued exceptions variables are named as expected
Excluded:
- pygments/tests/examplefiles/cr/test.cr
AllowedNames:
- e
- ex
- exception
- error
Enabled: true
Severity: Convention
# Problems found: 6
# Run `ameba --only Naming/TypeNames` for details
Naming/TypeNames:
Description: Enforces type names in camelcase manner
Excluded:
- pygments/tests/examplefiles/cr/test.cr
Enabled: true
Severity: Convention

View File

@@ -2,104 +2,6 @@
All notable changes to this project will be documented in this file.
## [0.10.0] - 2024-09-26
### 🚀 Features
- Optional conditional baking of lexers
### 🐛 Bug Fixes
- Strip binaries for release artifacts
- Fix metadata to show crystal
### ⚙️ Miscellaneous Tasks
- Build before tag
- *(build)* Strip static binary
### Bump
- Release v0.10.0
- Release v0.10.0
## [0.10.0] - 2024-09-26
### 🚀 Features
- Optional conditional baking of lexers
### 🐛 Bug Fixes
- Strip binaries for release artifacts
- Fix metadata to show crystal
### ⚙️ Miscellaneous Tasks
- Build before tag
- *(build)* Strip static binary
### Bump
- Release v0.10.0
## [0.10.0] - 2024-09-26
### 🚀 Features
- Optional conditional baking of lexers
### 🐛 Bug Fixes
- Strip binaries for release artifacts
- Fix metadata to show crystal
## [0.9.1] - 2024-09-22
### 🐛 Bug Fixes
- Terminal formatter was skipping things that it could highlight
- Bug in high-level API for png formatter
### 🧪 Testing
- Added minimal tests for svg and png formatters
## [0.9.0] - 2024-09-21
### 🚀 Features
- PNG writer based on Stumpy libs
### ⚙️ Miscellaneous Tasks
- Clean
- Detect version bump in release script
- Improve changelog handling
## [0.8.0] - 2024-09-21
### 🚀 Features
- SVG formatter
### 🐛 Bug Fixes
- HTML formatter was setting bold wrong
### 📚 Documentation
- Added instructions to add as a dependency
### 🧪 Testing
- Add basic tests for crystal and delegating lexers
- Added tests for CSS generation
### ⚙ Miscellaneous Tasks
- Fix example code in README
## [0.7.0] - 2024-09-10
### 🚀 Features

View File

@@ -113,11 +113,3 @@ tasks:
kcov --clean --include-path=./src ${PWD}/coverage ./bin/run_tests
outputs:
- coverage/index.html
loc:
phony: true
always_run: true
dependencies:
- src
commands: |
tokei src -e src/constants/

View File

@@ -45,14 +45,6 @@ $ tartrazine whatever.c -t catppuccin-macchiato --line-numbers \
## Usage as a Library
Add to your `shard.yml`:
```yaml
dependencies:
tartrazine:
github: ralsina/tartrazine
```
This is the high level API:
```crystal
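# A minimal sketch of the high-level calls, assuming the Tartrazine.to_html
# and Tartrazine.to_ansi helpers that the specs exercise; other keyword
# arguments are omitted here.
require "tartrazine"

puts Tartrazine.to_html("puts \"Hello, world!\"", "crystal", standalone: true)
puts Tartrazine.to_ansi("puts \"Hello, world!\"", "crystal")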
@@ -71,7 +63,7 @@ This does more or less the same thing, but more manually:
```crystal
lexer = Tartrazine.lexer("crystal")
formatter = Tartrazine::Html.new(
formatter = Tartrazine::Html.new (
theme: Tartrazine.theme("catppuccin-macchiato"),
line_numbers: true,
standalone: true,
@@ -82,25 +74,6 @@ puts formatter.format("puts \"Hello, world!\"", lexer)
The reason you may want to use the manual version is to reuse
the lexer and formatter objects for performance reasons.
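For instance (a minimal sketch, reusing only the `Tartrazine.lexer`, `Tartrazine.theme` and `Tartrazine::Html` calls shown above), the objects can be built once and used for any number of snippets:
```crystal
require "tartrazine"

# Build the lexer and formatter once...
lexer = Tartrazine.lexer("crystal")
formatter = Tartrazine::Html.new(
  theme: Tartrazine.theme("catppuccin-macchiato"),
  line_numbers: true,
  standalone: true
)

# ...then reuse them for every piece of code you need to highlight.
["puts 1", "puts 2"].each do |source|
  puts formatter.format(source, lexer)
end
```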
## Choosing what Lexers you want
By default Tartrazine will support all its lexers by embedding
them in the binary. This makes the binary large. If you are
using it as a library, you may want to just include a selection of lexers. To do that:
* Pass the `-Dnolexers` flag to the compiler
* Set the `TT_LEXERS` environment variable to a
comma-separated list of lexers you want to include.
This builds a binary with only the python, markdown, bash and yaml lexers (enough to highlight this `README.md`):
```bash
> TT_LEXERS=python,markdown,bash,yaml shards build -Dnolexers -d --error-trace
Dependencies are satisfied
Building: tartrazine
```
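To confirm which lexers actually made it into such a build, something like this should do (a sketch relying on the `Tartrazine.lexers` helper defined in `src/tartrazine.cr`):
```crystal
require "tartrazine"

# Lists only the lexers whose XML definitions were baked into this binary.
puts Tartrazine.lexers.join(", ")
```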
## Contributing
1. Fork it (<https://github.com/ralsina/tartrazine/fork>)

View File

@@ -7,10 +7,10 @@ docker run --rm --privileged \
# Build for AMD64
docker build . -f Dockerfile.static -t tartrazine-builder
docker run -ti --rm -v "$PWD":/app --user="$UID" tartrazine-builder /bin/sh -c "cd /app && rm -rf lib shard.lock && shards build --static --release && strip bin/tartrazine"
docker run -ti --rm -v "$PWD":/app --user="$UID" tartrazine-builder /bin/sh -c "cd /app && rm -rf lib shard.lock && shards build --static --release"
mv bin/tartrazine bin/tartrazine-static-linux-amd64
# Build for ARM64
docker build . -f Dockerfile.static --platform linux/arm64 -t tartrazine-builder
docker run -ti --rm -v "$PWD":/app --platform linux/arm64 --user="$UID" tartrazine-builder /bin/sh -c "cd /app && rm -rf lib shard.lock && shards build --static --release && strip bin/tartrazine"
docker run -ti --rm -v "$PWD":/app --platform linux/arm64 --user="$UID" tartrazine-builder /bin/sh -c "cd /app && rm -rf lib shard.lock && shards build --static --release"
mv bin/tartrazine bin/tartrazine-static-linux-arm64

View File

@@ -2,14 +2,14 @@
set e
PKGNAME=$(basename "$PWD")
VERSION=$(git cliff --bumped-version --unreleased |cut -dv -f2)
VERSION=$(git cliff --bumped-version |cut -dv -f2)
sed "s/^version:.*$/version: $VERSION/g" -i shard.yml
git add shard.yml
hace lint test
git cliff --bump -u -p CHANGELOG.md
git cliff --bump -o
git commit -a -m "bump: Release v$VERSION"
hace static
git tag "v$VERSION"
git push --tags
hace static
gh release create "v$VERSION" "bin/$PKGNAME-static-linux-amd64" "bin/$PKGNAME-static-linux-arm64" --title "Release v$VERSION" --notes "$(git cliff -l -s all)"

Binary file not shown.

Binary file not shown.

Binary file not shown.

Binary file not shown.

View File

@@ -38,12 +38,6 @@ for fname in glob.glob("lexers/*.xml"):
lexer_by_filename[filename].add(lexer_name)
with open("src/constants/lexers.cr", "w") as f:
# Crystal doesn't come from a xml file
lexer_by_name["crystal"] = "crystal"
lexer_by_name["cr"] = "crystal"
lexer_by_filename["*.cr"] = ["crystal"]
lexer_by_mimetype["text/x-crystal"] = "crystal"
f.write("module Tartrazine\n")
f.write(" LEXERS_BY_NAME = {\n")
for k in sorted(lexer_by_name.keys()):

View File

@@ -1,5 +1,5 @@
name: tartrazine
version: 0.10.0
version: 0.7.0
authors:
- Roberto Alsina <roberto.alsina@gmail.com>
@@ -18,10 +18,6 @@ dependencies:
github: ralsina/sixteen
docopt:
github: chenkovsky/docopt.cr
stumpy_utils:
github: stumpycr/stumpy_utils
stumpy_png:
github: stumpycr/stumpy_png
crystal: ">= 1.13.0"

View File

@@ -1 +0,0 @@
.e {color: #aa0000;background-color: #ffaaaa;}.b {background-color: #f0f3f3;tab-size: 8;}.k {color: #006699;font-weight: 600;}.kp {}.kt {color: #007788;}.na {color: #330099;}.nb {color: #336666;}.nc {color: #00aa88;font-weight: 600;}.nc {color: #336600;}.nd {color: #9999ff;}.ne {color: #999999;font-weight: 600;}.ne {color: #cc0000;font-weight: 600;}.nf {color: #cc00ff;}.nl {color: #9999ff;}.nn {color: #00ccff;font-weight: 600;}.nt {color: #330099;font-weight: 600;}.nv {color: #003333;}.ls {color: #cc3300;}.lsd {font-style: italic;}.lse {color: #cc3300;font-weight: 600;}.lsi {color: #aa0000;}.lso {color: #cc3300;}.lsr {color: #33aaaa;}.lss {color: #ffcc33;}.ln {color: #ff6600;}.o {color: #555555;}.ow {color: #000000;font-weight: 600;}.c {color: #0099ff;font-style: italic;}.cs {font-weight: 600;}.cp {color: #009999;font-style: normal;}.gd {background-color: #ffcccc;border: 1px solid #cc0000;}.ge {font-style: italic;}.ge {color: #ff0000;}.gh {color: #003300;font-weight: 600;}.gi {background-color: #ccffcc;border: 1px solid #00cc00;}.go {color: #aaaaaa;}.gp {color: #000099;font-weight: 600;}.gs {font-weight: 600;}.gs {color: #003300;font-weight: 600;}.gt {color: #99cc66;}.gu {text-decoration: underline;}.tw {color: #bbbbbb;}.lh {}

View File

@@ -1 +0,0 @@
.b {color: #b7b7b7;background-color: #101010;font-weight: 600;tab-size: 8;}.lh {color: #8eaaaa;background-color: #232323;}.t {color: #b7b7b7;}.e {color: #de6e6e;}.c {color: #333333;}.cp {color: #876c4f;}.cpf {color: #5f8787;}.k {color: #d69094;}.kt {color: #de6e6e;}.na {color: #8eaaaa;}.nb {color: #de6e6e;}.nbp {color: #de6e6e;}.nc {color: #8eaaaa;}.nc {color: #dab083;}.nd {color: #dab083;}.nf {color: #8eaaaa;}.nn {color: #8eaaaa;}.nt {color: #d69094;}.nv {color: #8eaaaa;}.nvi {color: #de6e6e;}.ln {color: #dab083;}.o {color: #60a592;}.ow {color: #d69094;}.l {color: #5f8787;}.ls {color: #5f8787;}.lsi {color: #876c4f;}.lsr {color: #60a592;}.lss {color: #dab083;}

View File

@@ -1 +0,0 @@
puts "Hello Crystal!"

View File

@@ -1 +0,0 @@
[{"type":"Text","value":"puts "},{"type":"LiteralString","value":"\"Hello Crystal!\""},{"type":"Text","value":"\n"}]

View File

@@ -1,11 +0,0 @@
from flask import Flask, request
app = Flask("{{name}}")
@app.route('/')
def handle():
return "Hello World from Flask!"
@app.route('/ping')
def ping():
return "OK"

View File

@@ -1 +0,0 @@
[{"type":"KeywordNamespace","value":"from"},{"type":"Text","value":" "},{"type":"NameNamespace","value":"flask"},{"type":"Text","value":" "},{"type":"KeywordNamespace","value":"import"},{"type":"Text","value":" "},{"type":"Name","value":"Flask"},{"type":"Punctuation","value":","},{"type":"Text","value":" "},{"type":"Name","value":"request"},{"type":"Text","value":"\n\n"},{"type":"Name","value":"app"},{"type":"Text","value":" "},{"type":"Operator","value":"="},{"type":"Text","value":" "},{"type":"Name","value":"Flask"},{"type":"Punctuation","value":"("},{"type":"LiteralStringDouble","value":"\""},{"type":"CommentPreproc","value":"{{"},{"type":"NameVariable","value":"name"},{"type":"CommentPreproc","value":"}}"},{"type":"LiteralStringDouble","value":"\")"},{"type":"Text","value":"\n\n"},{"type":"NameDecorator","value":"@app.route"},{"type":"Punctuation","value":"("},{"type":"LiteralStringSingle","value":"'/'"},{"type":"Punctuation","value":")"},{"type":"Text","value":"\n"},{"type":"Keyword","value":"def"},{"type":"Text","value":" "},{"type":"NameFunction","value":"handle"},{"type":"Punctuation","value":"():"},{"type":"Text","value":"\n "},{"type":"Keyword","value":"return"},{"type":"Text","value":" "},{"type":"LiteralStringDouble","value":"\"Hello World from Flask!\""},{"type":"Text","value":"\n\n"},{"type":"NameDecorator","value":"@app.route"},{"type":"Punctuation","value":"("},{"type":"LiteralStringSingle","value":"'/ping'"},{"type":"Punctuation","value":")"},{"type":"Text","value":"\n"},{"type":"Keyword","value":"def"},{"type":"Text","value":" "},{"type":"NameFunction","value":"ping"},{"type":"Punctuation","value":"():"},{"type":"Text","value":"\n "},{"type":"Keyword","value":"return"},{"type":"Text","value":" "},{"type":"LiteralStringDouble","value":"\"OK\""},{"type":"Text","value":"\n"}]

View File

@@ -1,15 +1,8 @@
require "./spec_helper"
require "digest/sha1"
# These are the testcases from Pygments
testcases = Dir.glob("#{__DIR__}/tests/**/*txt").sort
# These are custom testcases
examples = Dir.glob("#{__DIR__}/examples/**/*.*").reject(&.ends_with? ".json").sort!
# CSS Stylesheets
css_files = Dir.glob("#{__DIR__}/css/*.css")
# These lexers don't load because of parsing issues
failing_lexers = {
"webgpu_shading_language",
@@ -58,14 +51,6 @@ not_my_fault = {
describe Tartrazine do
describe "Lexer" do
examples.each do |example|
it "parses #{example}".split("/")[-2...].join("/") do
lexer = Tartrazine.lexer(name: File.basename(File.dirname(example)).downcase)
text = File.read(example)
expected = Array(Tartrazine::Token).from_json(File.read("#{example}.json"))
Tartrazine::RegexLexer.collapse_tokens(lexer.tokenizer(text).to_a).should eq expected
end
end
testcases.each do |testcase|
if known_bad.includes?(testcase)
pending "parses #{testcase}".split("/")[-2...].join("/") do
@@ -85,17 +70,6 @@ describe Tartrazine do
end
end
describe "formatter" do
css_files.each do |css_file|
it "generates #{css_file}" do
css = File.read(css_file)
theme = Tartrazine.theme(File.basename(css_file, ".css"))
formatter = Tartrazine::Html.new(theme: theme)
formatter.style_defs.strip.should eq css.strip
end
end
end
describe "to_html" do
it "should do basic highlighting" do
html = Tartrazine.to_html("puts 'Hello, World!'", "ruby", standalone: false)
@@ -104,7 +78,6 @@ describe Tartrazine do
)
end
end
describe "to_ansi" do
it "should do basic highlighting" do
ansi = Tartrazine.to_ansi("puts 'Hello, World!'", "ruby")
@@ -116,29 +89,11 @@ describe Tartrazine do
)
else
ansi.should eq(
"\e[38;2;171;70;66mputs\e[0m\e[38;2;216;216;216m \e[0m\e[38;2;161;181;108m'Hello, World!'\e[0m"
"\e[38;2;171;70;66mputs\e[0m\e[38;2;216;216;216m \e[0m'Hello, World!'"
)
end
end
end
describe "to_svg" do
it "should do basic highlighting" do
svg = Tartrazine.to_svg("puts 'Hello, World!'", "ruby", standalone: false)
svg.should eq(
"<text x=\"0\" y=\"19\" xml:space=\"preserve\"><tspan fill=\"#ab4642\">puts</tspan><tspan fill=\"#d8d8d8\"> </tspan><tspan fill=\"#a1b56c\">&#39;Hello, World!&#39;</tspan></text>"
)
end
end
describe "to_png" do
it "should do basic highlighting" do
png = Digest::SHA1.hexdigest(Tartrazine.to_png("puts 'Hello, World!'", "ruby"))
png.should eq(
"62d419dcd263fffffc265a0f04c156dc2530c362"
)
end
end
end
# Helper that creates lexer and tokenizes

View File

@@ -471,7 +471,7 @@ module Tartrazine
"application/x-fennel" => "fennel",
"application/x-fish" => "fish",
"application/x-forth" => "forth",
"application/x-gdscript" => "gdscript3",
"application/x-gdscript" => "gdscript",
"application/x-hcl" => "hcl",
"application/x-hy" => "hy",
"application/x-javascript" => "javascript",
@@ -594,7 +594,7 @@ module Tartrazine
"text/x-fortran" => "fortran",
"text/x-fsharp" => "fsharp",
"text/x-gas" => "gas",
"text/x-gdscript" => "gdscript3",
"text/x-gdscript" => "gdscript",
"text/x-gherkin" => "gherkin",
"text/x-gleam" => "gleam",
"text/x-glslsrc" => "glsl",

View File

@@ -17,19 +17,12 @@ module Tartrazine
end
def format(text : String, lexer : Lexer) : String
outp = String::Builder.new("")
format(text, lexer, outp)
outp.to_s
raise Exception.new("Not implemented")
end
# Return the styles, if the formatter supports it.
def style_defs : String
raise Exception.new("Not implemented")
end
# Is this line in the highlighted ranges?
def highlighted?(line : Int) : Bool
highlight_lines.any?(&.includes?(line))
end
end
end

View File

@@ -20,6 +20,12 @@ module Tartrazine
"#{i + 1}".rjust(4).ljust(5)
end
def format(text : String, lexer : BaseLexer) : String
outp = String::Builder.new("")
format(text, lexer, outp)
outp.to_s
end
def format(text : String, lexer : BaseLexer, outp : IO) : Nil
tokenizer = lexer.tokenizer(text)
i = 0
@@ -34,6 +40,8 @@
end
def colorize(text : String, token : String) : String
style = theme.styles.fetch(token, nil)
return text if style.nil?
if theme.styles.has_key?(token)
s = theme.styles[token]
else

View File

@@ -106,7 +106,8 @@ module Tartrazine
# These are true/false/nil
outp << "border: none;" if style.border == false
outp << "font-weight: #{@weight_of_bold};" if style.bold
outp << "font-weight: bold;" if style.bold
outp << "font-weight: #{@weight_of_bold};" if style.bold == false
outp << "font-style: italic;" if style.italic
outp << "font-style: normal;" if style.italic == false
outp << "text-decoration: underline;" if style.underline
@@ -133,5 +134,10 @@
end
class_prefix + Abbreviations[token]
end
# Is this line in the highlighted ranges?
def highlighted?(line : Int) : Bool
highlight_lines.any?(&.includes?(line))
end
end
end

View File

@@ -1,117 +0,0 @@
require "../formatter"
require "compress/gzip"
require "digest/sha1"
require "stumpy_png"
require "stumpy_utils"
module Tartrazine
def self.to_png(text : String, language : String,
theme : String = "default-dark",
line_numbers : Bool = false) : String
buf = IO::Memory.new
Tartrazine::Png.new(
theme: Tartrazine.theme(theme),
line_numbers: line_numbers
).format(text, Tartrazine.lexer(name: language), buf)
buf.to_s
end
class FontFiles
extend BakedFileSystem
bake_folder "../../fonts", __DIR__
end
class Png < Formatter
include StumpyPNG
property? line_numbers : Bool = false
@font_regular : PCFParser::Font
@font_bold : PCFParser::Font
@font_oblique : PCFParser::Font
@font_bold_oblique : PCFParser::Font
@font_width = 15
@font_height = 24
def initialize(@theme : Theme = Tartrazine.theme("default-dark"), @line_numbers : Bool = false)
@font_regular = load_font("/courier-regular.pcf.gz")
@font_bold = load_font("/courier-bold.pcf.gz")
@font_oblique = load_font("/courier-oblique.pcf.gz")
@font_bold_oblique = load_font("/courier-bold-oblique.pcf.gz")
end
private def load_font(name : String) : PCFParser::Font
compressed = FontFiles.get(name)
uncompressed = Compress::Gzip::Reader.open(compressed) do |gzip|
gzip.gets_to_end
end
PCFParser::Font.new(IO::Memory.new uncompressed)
end
private def line_label(i : Int32) : String
"#{i + 1}".rjust(4).ljust(5)
end
def format(text : String, lexer : BaseLexer, outp : IO) : Nil
# Create canvas of correct size
lines = text.split("\n")
canvas_height = lines.size * @font_height
canvas_width = lines.max_of(&.size)
canvas_width += 5 if line_numbers?
canvas_width *= @font_width
bg_color = RGBA.from_hex("##{theme.styles["Background"].background.try &.hex}")
canvas = Canvas.new(canvas_width, canvas_height, bg_color)
tokenizer = lexer.tokenizer(text)
x = 0
y = @font_height
i = 0
if line_numbers?
canvas.text(x, y, line_label(i), @font_regular, RGBA.from_hex("##{theme.styles["Background"].color.try &.hex}"))
x += 5 * @font_width
end
tokenizer.each do |token|
font, color = token_style(token[:type])
# These fonts are very limited
t = token[:value].gsub(/[^[:ascii:]]/, "?")
canvas.text(x, y, t.rstrip("\n"), font, color)
if token[:value].includes?("\n")
x = 0
y += @font_height
i += 1
if line_numbers?
canvas.text(x, y, line_label(i), @font_regular, RGBA.from_hex("##{theme.styles["Background"].color.try &.hex}"))
x += 4 * @font_width
end
end
x += token[:value].size * @font_width
end
StumpyPNG.write(canvas, outp)
end
def token_style(token : String) : {PCFParser::Font, RGBA}
if theme.styles.has_key?(token)
s = theme.styles[token]
else
# Themes don't contain information for each specific
# token type. However, they may contain information
# for a parent style. Worst case, we go to the root
# (Background) style.
s = theme.styles[theme.style_parents(token).reverse.find { |parent|
theme.styles.has_key?(parent)
}]
end
color = RGBA.from_hex("##{theme.styles["Background"].color.try &.hex}")
color = RGBA.from_hex("##{s.color.try &.hex}") if s.color
return {@font_bold_oblique, color} if s.bold && s.italic
return {@font_bold, color} if s.bold
return {@font_oblique, color} if s.italic
return {@font_regular, color}
end
end
end

View File

@@ -1,129 +0,0 @@
require "../constants/token_abbrevs.cr"
require "../formatter"
require "html"
module Tartrazine
def self.to_svg(text : String, language : String,
theme : String = "default-dark",
standalone : Bool = true,
line_numbers : Bool = false) : String
Tartrazine::Svg.new(
theme: Tartrazine.theme(theme),
standalone: standalone,
line_numbers: line_numbers
).format(text, Tartrazine.lexer(name: language))
end
class Svg < Formatter
property highlight_lines : Array(Range(Int32, Int32)) = [] of Range(Int32, Int32)
property line_number_id_prefix : String = "line-"
property line_number_start : Int32 = 1
property tab_width = 8
property? line_numbers : Bool = false
property? linkable_line_numbers : Bool = true
property? standalone : Bool = false
property weight_of_bold : Int32 = 600
property fs : Int32
property ystep : Int32
property theme : Theme
def initialize(@theme : Theme = Tartrazine.theme("default-dark"), *,
@highlight_lines = [] of Range(Int32, Int32),
@class_prefix : String = "",
@line_number_id_prefix = "line-",
@line_number_start = 1,
@tab_width = 8,
@line_numbers : Bool = false,
@linkable_line_numbers : Bool = true,
@standalone : Bool = false,
@weight_of_bold : Int32 = 600,
@font_family : String = "monospace",
@font_size : String = "14px")
if font_size.ends_with? "px"
@fs = font_size[0...-2].to_i
else
@fs = font_size.to_i
end
@ystep = @fs + 5
end
def format(text : String, lexer : BaseLexer, io : IO) : Nil
pre, post = wrap_standalone
io << pre if standalone?
format_text(text, lexer, io)
io << post if standalone?
end
# Wrap text into a full HTML document, including the CSS for the theme
def wrap_standalone
output = String.build do |outp|
outp << %(<?xml version="1.0" encoding="utf-8"?>
<!DOCTYPE svg PUBLIC "-//W3C//DTD SVG 1.0//EN" "http://www.w3.org/TR/2001/REC-SVG-20010904/DTD/svg10.dtd">
<svg xmlns="http://www.w3.org/2000/svg">
<g font-family="#{self.@font_family}" font-size="#{self.@font_size}">)
end
{output.to_s, "</g></svg>"}
end
private def line_label(i : Int32, x : Int32, y : Int32) : String
line_label = "#{i + 1}".rjust(4).ljust(5)
line_style = highlighted?(i + 1) ? "font-weight=\"#{@weight_of_bold}\"" : ""
line_id = linkable_line_numbers? ? "id=\"#{line_number_id_prefix}#{i + 1}\"" : ""
%(<text #{line_style} #{line_id} x="#{4*ystep}" y="#{y}" text-anchor="end">#{line_label}</text>)
end
def format_text(text : String, lexer : BaseLexer, outp : IO)
x = 0
y = ystep
i = 0
line_x = x
line_x += 5 * ystep if line_numbers?
tokenizer = lexer.tokenizer(text)
outp << line_label(i, x, y) if line_numbers?
outp << %(<text x="#{line_x}" y="#{y}" xml:space="preserve">)
tokenizer.each do |token|
if token[:value].ends_with? "\n"
outp << "<tspan #{get_style(token[:type])}>#{HTML.escape(token[:value][0...-1])}</tspan>"
outp << "</text>"
x = 0
y += ystep
i += 1
outp << line_label(i, x, y) if line_numbers?
outp << %(<text x="#{line_x}" y="#{y}" xml:space="preserve">)
else
outp << "<tspan#{get_style(token[:type])}>#{HTML.escape(token[:value])}</tspan>"
x += token[:value].size * ystep
end
end
outp << "</text>"
end
# Given a token type, return the style.
def get_style(token : String) : String
if !theme.styles.has_key? token
# Themes don't contain information for each specific
# token type. However, they may contain information
# for a parent style. Worst case, we go to the root
# (Background) style.
parent = theme.style_parents(token).reverse.find { |dad|
theme.styles.has_key?(dad)
}
theme.styles[token] = theme.styles[parent]
end
output = String.build do |outp|
style = theme.styles[token]
outp << " fill=\"##{style.color.try &.hex}\"" if style.color
# No support for background color or border in SVG
outp << " font-weight=\"#{@weight_of_bold}\"" if style.bold
outp << " font-weight=\"normal\"" if style.bold == false
outp << " font-style=\"italic\"" if style.italic
outp << " font-style=\"normal\"" if style.italic == false
outp << " text-decoration=\"underline\"" if style.underline
outp << " text-decoration=\"none" if style.underline == false
end
output
end
end
end

View File

@@ -6,21 +6,11 @@ require "crystal/syntax_highlighter"
module Tartrazine
class LexerFiles
extend BakedFileSystem
macro bake_selected_lexers
{% for lexer in env("TT_LEXERS").split "," %}
bake_file {{ lexer }}+".xml", {{ read_file "lexers/" + lexer + ".xml" }}
{% end %}
end
{% if flag?(:nolexers) %}
bake_selected_lexers
{% else %}
bake_folder "../lexers", __DIR__
{% end %}
bake_folder "../lexers", __DIR__
end
# Get the lexer object for a language name
# FIXME: support mimetypes
def self.lexer(name : String? = nil, filename : String? = nil, mimetype : String? = nil) : BaseLexer
return lexer_by_name(name) if name && name != "autodetect"
return lexer_by_filename(filename) if filename
@@ -43,8 +33,6 @@ module Tartrazine
raise Exception.new("Unknown lexer: #{name}") if lexer_file_name.nil?
RegexLexer.from_xml(LexerFiles.get("/#{lexer_file_name}.xml").gets_to_end)
rescue ex : BakedFileSystem::NoSuchFileError
raise Exception.new("Unknown lexer: #{name}")
end
private def self.lexer_by_filename(filename : String) : BaseLexer
@@ -96,8 +84,7 @@ module Tartrazine
# Return a list of all lexers
def self.lexers : Array(String)
file_map = LexerFiles.files.map(&.path)
LEXERS_BY_NAME.keys.select { |k| file_map.includes?("/#{k}.xml") }.sort!
LEXERS_BY_NAME.keys.sort!
end
# A token, the output of the tokenizer

View File

@ -4,10 +4,6 @@ require "./tartrazine"
HELP = <<-HELP
tartrazine: a syntax highlighting tool
You can use the CLI to generate HTML, terminal, JSON or SVG output
from a source file using different themes.
Keep in mind that not all formatters support all features.
Usage:
tartrazine (-h, --help)
tartrazine FILE -f html [-t theme][--standalone][--line-numbers]
@@ -15,10 +11,6 @@ Usage:
tartrazine -f html -t theme --css
tartrazine FILE -f terminal [-t theme][-l lexer][--line-numbers]
[-o output]
tartrazine FILE -f svg [-t theme][--standalone][--line-numbers]
[-l lexer][-o output]
tartrazine FILE -f png [-t theme][--line-numbers]
[-l lexer][-o output]
tartrazine FILE -f json [-o output]
tartrazine --list-themes
tartrazine --list-lexers
@@ -26,7 +18,7 @@ Usage:
tartrazine --version
Options:
-f <formatter> Format to use (html, terminal, json, svg)
-f <formatter> Format to use (html, terminal, json)
-t <theme> Theme to use, see --list-themes [default: default-dark]
-l <lexer> Lexer (language) to use, see --list-lexers. Use more than
one lexer with "+" (e.g. jinja+yaml) [default: autodetect]
@@ -79,15 +71,6 @@ if options["-f"]
formatter.theme = theme
when "json"
formatter = Tartrazine::Json.new
when "svg"
formatter = Tartrazine::Svg.new
formatter.standalone = options["--standalone"] != nil
formatter.line_numbers = options["--line-numbers"] != nil
formatter.theme = theme
when "png"
formatter = Tartrazine::Png.new
formatter.line_numbers = options["--line-numbers"] != nil
formatter.theme = theme
else
puts "Invalid formatter: #{formatter}"
exit 1

View File

@@ -1 +1 @@
require "../spec/tartrazine_spec.cr"
require "../spec/**"