mirror of https://github.com/ralsina/tartrazine.git
synced 2024-09-20 23:41:21 +00:00

Merge pull request #58 from mcarmonaa/feature-benchmarks

Benchmarks and results

This commit is contained in: commit 608949faad
 .gitignore (vendored) | 1
@@ -1 +1,2 @@
 .linguist
+benchmarks/output
 Makefile | 10
@@ -32,6 +32,16 @@ code-generate: $(LINGUIST_PATH)
 	mkdir -p data
 	go run internal/code-generator/main.go
 
+benchmarks: $(LINGUIST_PATH)
+	go test -run=NONE -bench=. && benchmarks/linguist-total.sh
+
+benchmarks-samples: $(LINGUIST_PATH)
+	go test -run=NONE -bench=. -benchtime=5us && benchmarks/linguist-samples.rb
+
+benchmarks-slow: $(LINGUIST_PATH)
+	mkdir -p benchmarks/output && go test -run=NONE -bench=. -slow -benchtime=100ms -timeout=100h >benchmarks/output/enry_samples.bench && \
+	benchmarks/linguist-samples.rb 5 >benchmarks/output/linguist_samples.bench
+
 clean:
 	rm -rf $(LINGUIST_PATH)
 
 README.md | 45
@@ -1,6 +1,6 @@
 # enry [![GoDoc](https://godoc.org/gopkg.in/src-d/enry.v1?status.svg)](https://godoc.org/gopkg.in/src-d/enry.v1) [![Build Status](https://travis-ci.org/src-d/enry.svg?branch=master)](https://travis-ci.org/src-d/enry) [![codecov](https://codecov.io/gh/src-d/enry/branch/master/graph/badge.svg)](https://codecov.io/gh/src-d/enry)
 
-File programming language detector and toolbox to ignore binary or vendored files. *enry* started as a port to _Go_ of the original [linguist](https://github.com/github/linguist) _Ruby_ library, with an improved *performance of 100x*.
+File programming language detector and toolbox to ignore binary or vendored files. *enry* started as a port to _Go_ of the original [linguist](https://github.com/github/linguist) _Ruby_ library, with improved *2x performance*.
 
 
 Installation
@@ -18,8 +18,9 @@ To build enry's CLI you must run
 
 it generates a binary in the project's root directory called `enry`. You can move this binary anywhere in your `PATH`.
 
 
 Examples
---------
+------------
 
 ```go
 lang, safe := enry.GetLanguageByExtension("foo.go")
@@ -55,7 +56,7 @@ langs := enry.GetLanguagesByFilename("Gemfile", "<content>", []string{})
 
 
 CLI
------------------
+------------
 
 You can use enry as a command,
 
@@ -115,7 +116,7 @@ Note that even if enry's CLI is compatible with linguist's, its main point is th
 
 
 Development
------------
+------------
 
 *enry* re-uses parts of the original [linguist](https://github.com/github/linguist), especially the data in `languages.yml`, to generate internal data structures. In order to update to the latest upstream, run
 
@@ -139,8 +140,40 @@ Using [linguist/samples](https://github.com/github/linguist/tree/master/samples)
 * all files for the SQL language fall through to the classifier because we don't parse this [disambiguator expression](https://github.com/github/linguist/blob/master/lib/linguist/heuristics.rb#L433) for `*.sql` files correctly. This expression doesn't comply with the pattern used by the rest of the [heuristics.rb](https://github.com/github/linguist/blob/master/lib/linguist/heuristics.rb) file.
 
 
+Benchmarks
+------------
+
+Enry's language detection has been compared with Linguist's. Linguist's [*linguist/samples*](https://github.com/github/linguist/tree/master/samples) directory was used as the set of files to run the benchmarks against.
+
+The following results were obtained:
+
+![histogram](https://raw.githubusercontent.com/src-d/enry/master/benchmarks/histogram/distribution.png)
+
+The histogram shows, for each time interval on the x axis, the number of files whose language detection took a time within that interval.
+
+Reviewing the enry/linguist comparison, you can see that most files were detected faster by enry than by linguist.
+
+We detected a few cases where enry turns out slower than linguist. This is due to Go's regexp engine being slower than Ruby's, which uses the [oniguruma](https://github.com/kkos/oniguruma) library, written in C.
+
+You can find the scripts and additional information (such as the software and hardware used, and the benchmark results per sample file) in the [*benchmarks*](https://github.com/src-d/enry/tree/master/benchmarks) directory.
+
+If you want to reproduce the same experiment you can run:
+
+    benchmarks/run.sh
+
+from the project's root directory. It runs the benchmarks for enry and linguist, parses the output, creates CSV files, and generates a histogram (you must have [gnuplot](http://gnuplot.info) installed to get the histogram). It can take a lot of time, so for a quick local look you can run either:
+
+    make benchmarks
+
+to get time averages for the main detection function and strategies over the whole samples set, or:
+
+    make benchmarks-samples
+
+if you want to see measurements per sample file.
+
+
 Why Enry?
----------
+------------
 
 In the movie [My Fair Lady](https://en.wikipedia.org/wiki/My_Fair_Lady), [Professor Henry Higgins](http://www.imdb.com/character/ch0011719/?ref_=tt_cl_t2) is one of the main characters. Henry is a linguist and at the very beginning of the movie enjoys guessing the nationality of people based on their accent.
 
@@ -148,6 +181,6 @@ In the movie [My Fair Lady](https://en.wikipedia.org/wiki/My_Fair_Lady), [Profes
 
 
 License
--------
+------------
 
 MIT, see [LICENSE](LICENSE)
 benchmark_test.go | 194 (new file)
@@ -0,0 +1,194 @@
package enry

import (
	"flag"
	"io/ioutil"
	"log"
	"os"
	"path/filepath"
	"testing"
)

const samplesDir = ".linguist/samples"

type sample struct {
	filename string
	content  []byte
}

var (
	slow              bool
	overcomeLanguage  string
	overcomeLanguages []string
	samples           []*sample
)

func TestMain(m *testing.M) {
	flag.BoolVar(&slow, "slow", false, "run benchmarks per sample for strategies too")
	flag.Parse()

	var err error
	samples, err = getSamples(samplesDir)
	if err != nil {
		log.Fatal(err)
	}

	os.Exit(m.Run())
}

func getSamples(dir string) ([]*sample, error) {
	samples := make([]*sample, 0, 2000)
	err := filepath.Walk(dir, func(path string, info os.FileInfo, err error) error {
		if err != nil {
			return err
		}

		if info.IsDir() {
			return nil
		}

		content, err := ioutil.ReadFile(path)
		if err != nil {
			return err
		}

		s := &sample{
			filename: path,
			content:  content,
		}

		samples = append(samples, s)
		return nil
	})

	return samples, err
}

func BenchmarkGetLanguageTotal(b *testing.B) {
	if slow {
		b.SkipNow()
	}

	var o string
	b.Run("GetLanguage()_TOTAL", func(b *testing.B) {
		for n := 0; n < b.N; n++ {
			for _, sample := range samples {
				o = GetLanguage(sample.filename, sample.content)
			}
		}

		overcomeLanguage = o
	})
}

func BenchmarkClassifyTotal(b *testing.B) {
	if slow {
		b.SkipNow()
	}

	var o []string
	b.Run("Classify()_TOTAL", func(b *testing.B) {
		for n := 0; n < b.N; n++ {
			for _, sample := range samples {
				o = DefaultClassifier.Classify(sample.content, nil)
			}

			overcomeLanguages = o
		}
	})
}

func BenchmarkStrategiesTotal(b *testing.B) {
	if slow {
		b.SkipNow()
	}

	benchmarks := []struct {
		name       string
		strategy   Strategy
		candidates []string
	}{
		{name: "GetLanguagesByModeline()_TOTAL", strategy: GetLanguagesByModeline},
		{name: "GetLanguagesByFilename()_TOTAL", strategy: GetLanguagesByFilename},
		{name: "GetLanguagesByShebang()_TOTAL", strategy: GetLanguagesByShebang},
		{name: "GetLanguagesByExtension()_TOTAL", strategy: GetLanguagesByExtension},
		{name: "GetLanguagesByContent()_TOTAL", strategy: GetLanguagesByContent},
	}

	var o []string
	for _, benchmark := range benchmarks {
		b.Run(benchmark.name, func(b *testing.B) {
			for n := 0; n < b.N; n++ {
				for _, sample := range samples {
					o = benchmark.strategy(sample.filename, sample.content, benchmark.candidates)
				}

				overcomeLanguages = o
			}
		})
	}
}

func BenchmarkGetLanguagePerSample(b *testing.B) {
	if !slow {
		b.SkipNow()
	}

	var o string
	for _, sample := range samples {
		b.Run("GetLanguage()_SAMPLE_"+sample.filename, func(b *testing.B) {
			for n := 0; n < b.N; n++ {
				o = GetLanguage(sample.filename, sample.content)
			}

			overcomeLanguage = o
		})
	}
}

func BenchmarkClassifyPerSample(b *testing.B) {
	if !slow {
		b.SkipNow()
	}

	var o []string
	for _, sample := range samples {
		b.Run("Classify()_SAMPLE_"+sample.filename, func(b *testing.B) {
			for n := 0; n < b.N; n++ {
				o = DefaultClassifier.Classify(sample.content, nil)
			}

			overcomeLanguages = o
		})
	}
}

func BenchmarkStrategiesPerSample(b *testing.B) {
	if !slow {
		b.SkipNow()
	}

	benchmarks := []struct {
		name       string
		strategy   Strategy
		candidates []string
	}{
		{name: "GetLanguagesByModeline()_SAMPLE_", strategy: GetLanguagesByModeline},
		{name: "GetLanguagesByFilename()_SAMPLE_", strategy: GetLanguagesByFilename},
		{name: "GetLanguagesByShebang()_SAMPLE_", strategy: GetLanguagesByShebang},
		{name: "GetLanguagesByExtension()_SAMPLE_", strategy: GetLanguagesByExtension},
		{name: "GetLanguagesByContent()_SAMPLE_", strategy: GetLanguagesByContent},
	}

	var o []string
	for _, benchmark := range benchmarks {
		for _, sample := range samples {
			b.Run(benchmark.name+sample.filename, func(b *testing.B) {
				for n := 0; n < b.N; n++ {
					o = benchmark.strategy(sample.filename, sample.content, benchmark.candidates)
				}

				overcomeLanguages = o
			})
		}
	}
}
 benchmarks/csv/enry-distribution.csv | 6 (new file)
@@ -0,0 +1,6 @@
timeInterval,enry,numberOfFiles
1us-10us,enry,96
10us-100us,enry,1244
100us-1ms,enry,321
1ms-10ms,enry,135
10ms-100ms,enry,43
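As a sanity check on the schema above, the rows can be parsed with Go's encoding/csv and the file counts summed. A minimal sketch (the data string is copied from the CSV above; the helper name totalFiles is ours, not part of the project):

```go
package main

import (
	"encoding/csv"
	"fmt"
	"strconv"
	"strings"
)

// distributionCSV mirrors benchmarks/csv/enry-distribution.csv above.
const distributionCSV = `timeInterval,enry,numberOfFiles
1us-10us,enry,96
10us-100us,enry,1244
100us-1ms,enry,321
1ms-10ms,enry,135
10ms-100ms,enry,43
`

// totalFiles sums the numberOfFiles column, skipping the header row.
func totalFiles(raw string) (int, error) {
	rows, err := csv.NewReader(strings.NewReader(raw)).ReadAll()
	if err != nil {
		return 0, err
	}

	total := 0
	for _, row := range rows[1:] {
		n, err := strconv.Atoi(row[2])
		if err != nil {
			return 0, err
		}
		total += n
	}
	return total, nil
}

func main() {
	total, err := totalFiles(distributionCSV)
	if err != nil {
		panic(err)
	}
	fmt.Println("files measured:", total) // 96+1244+321+135+43 = 1839
}
```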
 benchmarks/csv/enry-samples.csv | 12874 (new file; diff suppressed because it is too large)
 benchmarks/csv/enry-total.csv | 8 (new file)
@@ -0,0 +1,8 @@
function,tool,iterations,ns/op
GetLanguage(),enry,100,1915861259
Classify(),enry,5,39977943775
GetLanguagesByModeline(),enry,1000,196571071
GetLanguagesByFilename(),enry,2000000,89774
GetLanguagesByShebang(),enry,100000,1892569
GetLanguagesByExtension(),enry,200000,921160
GetLanguagesByContent(),enry,1000,286159159
 benchmarks/csv/linguist-distribution.csv | 6 (new file)
@@ -0,0 +1,6 @@
timeInterval,linguist,numberOfFiles
1us-10us,linguist,0
10us-100us,linguist,74
100us-1ms,linguist,920
1ms-10ms,linguist,788
10ms-100ms,linguist,57
 benchmarks/csv/linguist-samples.csv | 12874 (new file; diff suppressed because it is too large)
 benchmarks/csv/linguist-total.csv | 8 (new file)
@@ -0,0 +1,8 @@
function,tool,iterations,ns/op
GetLanguage(),linguist,5,3979096800
Classify(),linguist,5,178253431800
GetLanguagesByModeline(),linguist,5,2582204000
GetLanguagesByFilename(),linguist,5,2688800
GetLanguagesByShebang(),linguist,5,77155200
GetLanguagesByExtension(),linguist,5,6688800
GetLanguagesByContent(),linguist,5,161719000
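The two *-total.csv files can be used to back out the headline speedup the README quotes: dividing linguist's GetLanguage() ns/op by enry's gives roughly 2x. A quick sketch (the constants are copied from the CSVs above):

```go
package main

import "fmt"

// ns/op for GetLanguage() over the whole sample set, copied from
// enry-total.csv and linguist-total.csv above.
const (
	enryNs     = 1915861259.0
	linguistNs = 3979096800.0
)

func main() {
	ratio := linguistNs / enryNs
	fmt.Printf("linguist/enry speedup for GetLanguage(): %.2fx\n", ratio) // about 2.08x
}
```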
 benchmarks/histogram/distribution.png | BIN (new file, 17 KiB; binary file not shown)
 benchmarks/linguist-samples.rb | 126 (new executable file)
@@ -0,0 +1,126 @@
#!/usr/bin/env ruby

require 'benchmark'
require 'linguist'

iterations = (ARGV[0] || 1).to_i

# BenchBlob wraps a FileBlob to keep data loaded and to clean attributes added by language detection.
class BenchBlob < Linguist::FileBlob
  attr_accessor :data

  def initialize(path, base_path = nil)
    super
    @data = File.read(@fullpath)
  end

  def clean
    @_mime_type = nil
    @detect_encoding = nil
    @lines = nil
  end
end

def get_samples(root)
  samples = Array.new
  Dir.foreach(root) do |file|
    path = File.join(root, file)
    if file == "." or file == ".."
      next
    elsif File.directory?(path)
      get_samples(path).each do |blob|
        samples << blob
      end
    else
      samples << BenchBlob.new(path)
    end
  end
  return samples
end

samples = get_samples('.linguist/samples')
languages = Linguist::Language.all

samples.each do |blob|
  sample_name = blob.path.gsub(/\s/, '_')
  Benchmark.bmbm do |bm|
    bm.report('GetLanguage()_SAMPLE_' + sample_name + ' ' + iterations.to_s) do
      iterations.times do
        Linguist::detect(blob)
        blob.clean
      end
    end
  end
end

samples.each do |blob|
  sample_name = blob.path.gsub(/\s/, '_')
  Benchmark.bmbm do |bm|
    bm.report('Classify()_SAMPLE_' + sample_name + ' ' + iterations.to_s) do
      iterations.times do
        Linguist::Classifier.classify(Linguist::Samples.cache, blob.data)
        blob.clean
      end
    end
  end
end

samples.each do |blob|
  sample_name = blob.path.gsub(/\s/, '_')
  Benchmark.bmbm do |bm|
    bm.report('GetLanguagesByModeline()_SAMPLE_' + sample_name + ' ' + iterations.to_s) do
      iterations.times do
        Linguist::Strategy::Modeline.call(blob, languages)
        blob.clean
      end
    end
  end
end

samples.each do |blob|
  sample_name = blob.path.gsub(/\s/, '_')
  Benchmark.bmbm do |bm|
    bm.report('GetLanguagesByFilename()_SAMPLE_' + sample_name + ' ' + iterations.to_s) do
      iterations.times do
        Linguist::Strategy::Filename.call(blob, languages)
        blob.clean
      end
    end
  end
end

samples.each do |blob|
  sample_name = blob.path.gsub(/\s/, '_')
  Benchmark.bmbm do |bm|
    bm.report('GetLanguagesByShebang()_SAMPLE_' + sample_name + ' ' + iterations.to_s) do
      iterations.times do
        Linguist::Shebang.call(blob, languages)
        blob.clean
      end
    end
  end
end

samples.each do |blob|
  sample_name = blob.path.gsub(/\s/, '_')
  Benchmark.bmbm do |bm|
    bm.report('GetLanguagesByExtension()_SAMPLE_' + sample_name + ' ' + iterations.to_s) do
      iterations.times do
        Linguist::Strategy::Extension.call(blob, languages)
        blob.clean
      end
    end
  end
end

samples.each do |blob|
  sample_name = blob.path.gsub(/\s/, '_')
  Benchmark.bmbm do |bm|
    bm.report('GetLanguagesByContent()_SAMPLE_' + sample_name + ' ' + iterations.to_s) do
      iterations.times do
        Linguist::Heuristics.call(blob, languages)
        blob.clean
      end
    end
  end
end
 benchmarks/linguist-total.rb | 120 (new executable file)
@@ -0,0 +1,120 @@
#!/usr/bin/env ruby

require 'benchmark'
require 'linguist'

iterations = (ARGV[0] || 1).to_i

# BenchBlob wraps a FileBlob to keep data loaded and to clean attributes added by language detection.
class BenchBlob < Linguist::FileBlob
  attr_accessor :data
  attr_accessor :fullpath

  def initialize(path, base_path = nil)
    super
    @data = File.read(@fullpath)
  end

  def clean
    @_mime_type = nil
    @detect_encoding = nil
    @lines = nil
  end
end

def get_samples(root)
  samples = Array.new
  Dir.foreach(root) do |file|
    path = File.join(root, file)
    if file == "." or file == ".."
      next
    elsif File.directory?(path)
      get_samples(path).each do |blob|
        samples << blob
      end
    else
      samples << BenchBlob.new(path)
    end
  end
  return samples
end

samples = get_samples('.linguist/samples')
languages = Linguist::Language.all

Benchmark.bmbm do |bm|
  time = bm.report('GetLanguage()_TOTAL ' + iterations.to_s) do
    iterations.times do
      samples.each do |blob|
        Linguist::detect(blob)
        blob.clean
      end
    end
  end
end

Benchmark.bmbm do |bm|
  bm.report('Classify()_TOTAL ' + iterations.to_s) do
    iterations.times do
      samples.each do |blob|
        Linguist::Classifier.classify(Linguist::Samples.cache, blob.data)
        blob.clean
      end
    end
  end
end

Benchmark.bmbm do |bm|
  bm.report('GetLanguagesByModeline()_TOTAL ' + iterations.to_s) do
    iterations.times do
      samples.each do |blob|
        Linguist::Strategy::Modeline.call(blob, languages)
        blob.clean
      end
    end
  end
end

Benchmark.bmbm do |bm|
  bm.report('GetLanguagesByFilename()_TOTAL ' + iterations.to_s) do
    iterations.times do
      samples.each do |blob|
        Linguist::Strategy::Filename.call(blob, languages)
        blob.clean
      end
    end
  end
end

Benchmark.bmbm do |bm|
  bm.report('GetLanguagesByShebang()_TOTAL ' + iterations.to_s) do
    iterations.times do
      samples.each do |blob|
        Linguist::Shebang.call(blob, languages)
        blob.clean
      end
    end
  end
end

Benchmark.bmbm do |bm|
  bm.report('GetLanguagesByExtension()_TOTAL ' + iterations.to_s) do
    iterations.times do
      samples.each do |blob|
        Linguist::Strategy::Extension.call(blob, languages)
        blob.clean
      end
    end
  end
end

Benchmark.bmbm do |bm|
  bm.report('GetLanguagesByContent()_TOTAL ' + iterations.to_s) do
    iterations.times do
      samples.each do |blob|
        Linguist::Heuristics.call(blob, languages)
        blob.clean
      end
    end
  end
end
 benchmarks/parse.sh | 5 (new executable file)
@@ -0,0 +1,5 @@
#!/bin/sh

cd benchmarks/output && go run ../parser/main.go -outdir ../csv && \
cd ../csv && go run ../parser/main.go -distribution
 benchmarks/parser/main.go | 386 (new file)
@@ -0,0 +1,386 @@
package main

import (
	"bufio"
	"bytes"
	"encoding/csv"
	"flag"
	"fmt"
	"io/ioutil"
	"log"
	"math"
	"os"
	"path/filepath"
	"runtime"
	"sort"
	"strconv"
	"strings"
)

const (
	// functions benchmarked
	getLanguageFunc = "GetLanguage()"
	classifyFunc    = "Classify()"
	modelineFunc    = "GetLanguagesByModeline()"
	filenameFunc    = "GetLanguagesByFilename()"
	shebangFunc     = "GetLanguagesByShebang()"
	extensionFunc   = "GetLanguagesByExtension()"
	contentFunc     = "GetLanguagesByContent()"

	// benchmark outputs
	enryTotalBench       = "enry_total.bench"
	enrySamplesBench     = "enry_samples.bench"
	linguistTotalBench   = "linguist_total.bench"
	linguistSamplesBench = "linguist_samples.bench"

	// files to generate
	enryTotalCSV       = "enry-total.csv"
	enrySamplesCSV     = "enry-samples.csv"
	linguistTotalCSV   = "linguist-total.csv"
	linguistSamplesCSV = "linguist-samples.csv"

	// files to generate with flag -distribution
	enryDistributionCSV     = "enry-distribution.csv"
	linguistDistributionCSV = "linguist-distribution.csv"
)

var (
	// flags
	distribution bool
	outDir       string

	enryFunctions         = []string{getLanguageFunc, classifyFunc, modelineFunc, filenameFunc, shebangFunc, extensionFunc, contentFunc}
	distributionIntervals = []string{"1us-10us", "10us-100us", "100us-1ms", "1ms-10ms", "10ms-100ms"}
)

func main() {
	flag.BoolVar(&distribution, "distribution", false, "generate enry-distribution.csv and linguist-distribution.csv")
	flag.StringVar(&outDir, "outdir", "", "path to leave csv files")
	flag.Parse()

	if distribution {
		generateDistributionCSV()
		return
	}

	generateCSV()
}

func generateDistributionCSV() {
	CSVFiles := []struct {
		in   string
		out  string
		tool string
	}{
		{in: enrySamplesCSV, out: enryDistributionCSV, tool: "enry"},
		{in: linguistSamplesCSV, out: linguistDistributionCSV, tool: "linguist"},
	}

	for _, CSVFile := range CSVFiles {
		f, err := os.Open(CSVFile.in)
		if err != nil {
			log.Println(err)
			continue
		}
		defer f.Close()

		r := csv.NewReader(f)
		CSVSamples, err := r.ReadAll()
		if err != nil {
			log.Println(err)
			continue
		}

		CSVDistribution, err := buildDistribution(CSVSamples[1:], CSVFile.tool)
		if err != nil {
			log.Println(err)
			continue
		}

		if err := writeCSV(CSVDistribution, filepath.Join(outDir, CSVFile.out)); err != nil {
			log.Println(err)
			continue
		}
	}
}

func buildDistribution(CSVSamples [][]string, tool string) ([][]string, error) {
	count := make(map[string]int, len(distributionIntervals))
	for _, row := range CSVSamples {
		if row[1] != getLanguageFunc {
			continue
		}

		num, err := strconv.ParseFloat(row[len(row)-1], 64)
		if err != nil {
			return nil, err
		}

		arrangeByTime(count, num)
	}

	CSVDistribution := make([][]string, 0, len(count)+1)
	firstLine := []string{"timeInterval", tool, "numberOfFiles"}
	CSVDistribution = append(CSVDistribution, firstLine)
	for _, interval := range distributionIntervals {
		number := strconv.FormatInt(int64(count[interval]), 10)
		row := []string{interval, tool, number}
		CSVDistribution = append(CSVDistribution, row)
	}

	printDistributionInfo(count, tool)
	return CSVDistribution, nil
}

func printDistributionInfo(count map[string]int, tool string) {
	total := 0
	for _, v := range count {
		total += v
	}

	fmt.Println(tool, "files", total)
	fmt.Println("Distribution")
	for _, interval := range distributionIntervals {
		fmt.Println("\t", interval, count[interval])
	}

	fmt.Println("Percentage")
	for _, interval := range distributionIntervals {
		p := (float64(count[interval]) / float64(total)) * 100.00
		fmt.Printf("\t %s %f%%\n", interval, p)
	}

	fmt.Printf("\n\n")
}

func arrangeByTime(count map[string]int, num float64) {
	switch {
	case num > 1000.00 && num <= 10000.00:
		count[distributionIntervals[0]]++
	case num > 10000.00 && num <= 100000.00:
		count[distributionIntervals[1]]++
	case num > 100000.00 && num <= 1000000.00:
		count[distributionIntervals[2]]++
	case num > 1000000.00 && num <= 10000000.00:
		count[distributionIntervals[3]]++
	case num > 10000000.00 && num <= 100000000.00:
		count[distributionIntervals[4]]++
	}
}

func writeCSV(CSVData [][]string, outPath string) error {
	out, err := os.Create(outPath)
	if err != nil {
		return err
	}

	w := csv.NewWriter(out)
	w.WriteAll(CSVData)

	if err := w.Error(); err != nil {
		return err
	}

	return nil
}

type parse func(data []byte, tool string) ([][]string, error)

func generateCSV() {
	bmFiles := []struct {
		in    string
		out   string
		tool  string
		parse parse
	}{
		{in: enryTotalBench, out: enryTotalCSV, tool: "enry", parse: parseTotal},
		{in: linguistTotalBench, out: linguistTotalCSV, tool: "linguist", parse: parseTotal},
		{in: enrySamplesBench, out: enrySamplesCSV, tool: "enry", parse: parseSamples},
		{in: linguistSamplesBench, out: linguistSamplesCSV, tool: "linguist", parse: parseSamples},
	}

	for _, bmFile := range bmFiles {
		buf, err := ioutil.ReadFile(bmFile.in)
		if err != nil {
			log.Println(err)
			continue
		}

		info, err := bmFile.parse(buf, bmFile.tool)
		if err != nil {
			log.Println(err)
			continue
		}

		if err := writeCSV(info, filepath.Join(outDir, bmFile.out)); err != nil {
			log.Println(err)
			continue
		}
	}
}

func parseTotal(data []byte, tool string) ([][]string, error) {
	const totalLine = "_TOTAL"
	parsedInfo := map[string][]string{}
	buf := bufio.NewScanner(bytes.NewReader(data))
	for buf.Scan() {
		line := buf.Text()
		if strings.Contains(line, totalLine) {
			split := strings.Fields(line)
			row, err := getRow(split, tool)
			if err != nil {
				return nil, err
			}

			parsedInfo[row[0]] = row
		}
	}

	if err := buf.Err(); err != nil {
		return nil, err
	}

	firstLine := []string{"function", "tool", "iterations", "ns/op"}
	return prepareInfoForCSV(parsedInfo, firstLine), nil
}

func getRow(line []string, tool string) ([]string, error) {
	row := make([]string, 0, 3)
	for _, function := range enryFunctions {
		if strings.Contains(line[0], function) {
			row = append(row, function)
			break
		}
	}

	row = append(row, tool)
	iterations := line[1]
	row = append(row, iterations)

	average, err := getAverage(line)
	if err != nil {
		return nil, err
	}

	row = append(row, average)
	return row, nil
}

func getAverage(line []string) (string, error) {
	average := line[len(line)-1]
	if !strings.HasSuffix(average, ")") {
		return line[2], nil
	}

	totalTime := strings.Trim(average, "() ")
	time, err := strconv.ParseFloat(totalTime, 64)
	if err != nil {
		return "", err
	}

	iterations := line[1]
	i, err := strconv.ParseFloat(iterations, 64)
	if err != nil {
		return "", err
	}

	avg := (time * math.Pow10(9)) / i
	return fmt.Sprintf("%d", int(avg)), nil
}

func prepareInfoForCSV(parsedInfo map[string][]string, firstLine []string) [][]string {
	info := createInfoWithFirstLine(firstLine, len(parsedInfo))
	for _, function := range enryFunctions {
		info = append(info, parsedInfo[function])
	}

	return info
}

func createInfoWithFirstLine(firstLine []string, sliceLength int) (info [][]string) {
	if len(firstLine) > 0 {
|
||||||
|
info = make([][]string, 0, sliceLength+1)
|
||||||
|
info = append(info, firstLine)
|
||||||
|
} else {
|
||||||
|
info = make([][]string, 0, sliceLength)
|
||||||
|
}
|
||||||
|
|
||||||
|
return
|
||||||
|
}
|
||||||
|
|
||||||
|
type enryFuncs map[string][]string
|
||||||
|
|
||||||
|
func newEnryFuncs() enryFuncs {
|
||||||
|
return enryFuncs{
|
||||||
|
getLanguageFunc: nil,
|
||||||
|
classifyFunc: nil,
|
||||||
|
modelineFunc: nil,
|
||||||
|
filenameFunc: nil,
|
||||||
|
shebangFunc: nil,
|
||||||
|
extensionFunc: nil,
|
||||||
|
contentFunc: nil,
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
func parseSamples(data []byte, tool string) ([][]string, error) {
|
||||||
|
const sampleLine = "SAMPLE_"
|
||||||
|
parsedInfo := map[string]enryFuncs{}
|
||||||
|
buf := bufio.NewScanner(bytes.NewReader(data))
|
||||||
|
for buf.Scan() {
|
||||||
|
line := buf.Text()
|
||||||
|
if strings.Contains(line, sampleLine) {
|
||||||
|
split := strings.Fields(line)
|
||||||
|
name := getSampleName(split[0])
|
||||||
|
if _, ok := parsedInfo[name]; !ok {
|
||||||
|
parsedInfo[name] = newEnryFuncs()
|
||||||
|
}
|
||||||
|
|
||||||
|
row := make([]string, 0, 4)
|
||||||
|
row = append(row, name)
|
||||||
|
r, err := getRow(split, tool)
|
||||||
|
if err != nil {
|
||||||
|
return nil, err
|
||||||
|
}
|
||||||
|
|
||||||
|
row = append(row, r...)
|
||||||
|
function := row[1]
|
||||||
|
parsedInfo[name][function] = row
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
if err := buf.Err(); err != nil {
|
||||||
|
return nil, err
|
||||||
|
}
|
||||||
|
|
||||||
|
firstLine := []string{"file", "function", "tool", "iterations", "ns/op"}
|
||||||
|
return prepareSamplesInfoForCSV(parsedInfo, firstLine), nil
|
||||||
|
}
|
||||||
|
|
||||||
|
func getSampleName(s string) string {
|
||||||
|
start := strings.Index(s, "SAMPLE_") + len("SAMPLE_")
|
||||||
|
suffix := fmt.Sprintf("-%d", runtime.GOMAXPROCS(-1))
|
||||||
|
name := strings.TrimSuffix(s[start:], suffix)
|
||||||
|
return name
|
||||||
|
}
|
||||||
|
|
||||||
|
func prepareSamplesInfoForCSV(parsedInfo map[string]enryFuncs, firstLine []string) [][]string {
|
||||||
|
info := createInfoWithFirstLine(firstLine, len(parsedInfo)*len(enryFunctions))
|
||||||
|
orderedKeys := sortKeys(parsedInfo)
|
||||||
|
for _, path := range orderedKeys {
|
||||||
|
sampleInfo := prepareInfoForCSV(parsedInfo[path], nil)
|
||||||
|
info = append(info, sampleInfo...)
|
||||||
|
}
|
||||||
|
|
||||||
|
return info
|
||||||
|
}
|
||||||
|
|
||||||
|
func sortKeys(parsedInfo map[string]enryFuncs) []string {
|
||||||
|
keys := make([]string, 0, len(parsedInfo))
|
||||||
|
for key := range parsedInfo {
|
||||||
|
keys = append(keys, key)
|
||||||
|
}
|
||||||
|
|
||||||
|
sort.Strings(keys)
|
||||||
|
return keys
|
||||||
|
}
|
21	benchmarks/plot-histogram.gp	Executable file
@@ -0,0 +1,21 @@
#!/usr/bin/env gnuplot

set terminal png large font "arial,26" size 1920,1080
set output 'benchmarks/histogram/distribution.png'

set datafile separator comma
set key under

set style data histogram
set style histogram clustered gap 1 title offset 1,1
set style fill solid noborder
set boxwidth 0.95
set grid y
set bmargin 12
set autoscale
set title "Number of files per processing time"

plot newhistogram, 'benchmarks/csv/enry-distribution.csv' using 3:xtic(1) title "enry", 'benchmarks/csv/linguist-distribution.csv' using 3 title "linguist"

unset output
4	benchmarks/run-benchmarks.sh	Executable file
@@ -0,0 +1,4 @@
#!/bin/sh

mkdir -p benchmarks/output && go test -run NONE -bench=. -benchtime=120s -timeout=100h >benchmarks/output/enry_total.bench && \
	benchmarks/linguist-total.rb 5 >benchmarks/output/linguist_total.bench
4	benchmarks/run.sh	Executable file
@@ -0,0 +1,4 @@
#!/bin/sh

benchmarks/run-benchmarks.sh && make benchmarks-slow && \
	benchmarks/parse.sh && benchmarks/plot-histogram.gp
9	benchmarks/soft-hard-info.txt	Normal file
@@ -0,0 +1,9 @@
# Hardware and software used to run benchmarks

Dell XPS 9360
Linux 4.11.6-3-ARCH #1 SMP PREEMPT Thu Jun 22 12:21:46 CEST 2017 x86_64
go version go1.8.3 linux/amd64
ruby 2.4.1p111 (2017-03-22 revision 58053) [x86_64-linux]

github/linguist/samples commit: d5c8db3fb91963c4b2762ca2ea2ff7cfac109f68
438
data/content.go
438
data/content.go
@ -4,441 +4,439 @@ package data
|
|||||||
// THIS FILE SHOULD NOT BE EDITED BY HAND
|
// THIS FILE SHOULD NOT BE EDITED BY HAND
|
||||||
// Extracted from github/linguist commit: d5c8db3fb91963c4b2762ca2ea2ff7cfac109f68
|
// Extracted from github/linguist commit: d5c8db3fb91963c4b2762ca2ea2ff7cfac109f68
|
||||||
|
|
||||||
import (
|
import "gopkg.in/toqueteos/substring.v1"
|
||||||
"regexp"
|
|
||||||
)
|
|
||||||
|
|
||||||
type languageMatcher func([]byte) []string
|
type languageMatcher func([]byte) []string
|
||||||
|
|
||||||
var ContentMatchers = map[string]languageMatcher{
|
var ContentMatchers = map[string]languageMatcher{
|
||||||
".asc": func(i []byte) []string {
|
".asc": func(i []byte) []string {
|
||||||
if asc_PublicKey_Matcher_0.Match(i) {
|
if asc_PublicKey_Matcher_0.Match(string(i)) {
|
||||||
return []string{"Public Key"}
|
return []string{"Public Key"}
|
||||||
} else if asc_AsciiDoc_Matcher_0.Match(i) {
|
} else if asc_AsciiDoc_Matcher_0.Match(string(i)) {
|
||||||
return []string{"AsciiDoc"}
|
return []string{"AsciiDoc"}
|
||||||
} else if asc_AGSScript_Matcher_0.Match(i) {
|
} else if asc_AGSScript_Matcher_0.Match(string(i)) {
|
||||||
return []string{"AGS Script"}
|
return []string{"AGS Script"}
|
||||||
}
|
}
|
||||||
|
|
||||||
return nil
|
return nil
|
||||||
},
|
},
|
||||||
".bb": func(i []byte) []string {
|
".bb": func(i []byte) []string {
|
||||||
if bb_BlitzBasic_Matcher_0.Match(i) || bb_BlitzBasic_Matcher_1.Match(i) {
|
if bb_BlitzBasic_Matcher_0.Match(string(i)) || bb_BlitzBasic_Matcher_1.Match(string(i)) {
|
||||||
return []string{"BlitzBasic"}
|
return []string{"BlitzBasic"}
|
||||||
} else if bb_BitBake_Matcher_0.Match(i) {
|
} else if bb_BitBake_Matcher_0.Match(string(i)) {
|
||||||
return []string{"BitBake"}
|
return []string{"BitBake"}
|
||||||
}
|
}
|
||||||
|
|
||||||
return nil
|
return nil
|
||||||
},
|
},
|
||||||
".builds": func(i []byte) []string {
|
".builds": func(i []byte) []string {
|
||||||
if builds_XML_Matcher_0.Match(i) {
|
if builds_XML_Matcher_0.Match(string(i)) {
|
||||||
return []string{"XML"}
|
return []string{"XML"}
|
||||||
}
|
}
|
||||||
|
|
||||||
return []string{"Text"}
|
return []string{"Text"}
|
||||||
},
|
},
|
||||||
".ch": func(i []byte) []string {
|
".ch": func(i []byte) []string {
|
||||||
if ch_xBase_Matcher_0.Match(i) {
|
if ch_xBase_Matcher_0.Match(string(i)) {
|
||||||
return []string{"xBase"}
|
return []string{"xBase"}
|
||||||
}
|
}
|
||||||
|
|
||||||
return nil
|
return nil
|
||||||
},
|
},
|
||||||
".cl": func(i []byte) []string {
|
".cl": func(i []byte) []string {
|
||||||
if cl_CommonLisp_Matcher_0.Match(i) {
|
if cl_CommonLisp_Matcher_0.Match(string(i)) {
|
||||||
return []string{"Common Lisp"}
|
return []string{"Common Lisp"}
|
||||||
} else if cl_Cool_Matcher_0.Match(i) {
|
} else if cl_Cool_Matcher_0.Match(string(i)) {
|
||||||
return []string{"Cool"}
|
return []string{"Cool"}
|
||||||
} else if cl_OpenCL_Matcher_0.Match(i) {
|
} else if cl_OpenCL_Matcher_0.Match(string(i)) {
|
||||||
return []string{"OpenCL"}
|
return []string{"OpenCL"}
|
||||||
}
|
}
|
||||||
|
|
||||||
return nil
|
return nil
|
||||||
},
|
},
|
||||||
".cls": func(i []byte) []string {
|
".cls": func(i []byte) []string {
|
||||||
if cls_TeX_Matcher_0.Match(i) {
|
if cls_TeX_Matcher_0.Match(string(i)) {
|
||||||
return []string{"TeX"}
|
return []string{"TeX"}
|
||||||
}
|
}
|
||||||
|
|
||||||
return nil
|
return nil
|
||||||
},
|
},
|
||||||
".cs": func(i []byte) []string {
|
".cs": func(i []byte) []string {
|
||||||
if cs_Smalltalk_Matcher_0.Match(i) {
|
if cs_Smalltalk_Matcher_0.Match(string(i)) {
|
||||||
return []string{"Smalltalk"}
|
return []string{"Smalltalk"}
|
||||||
} else if cs_CSharp_Matcher_0.Match(i) || cs_CSharp_Matcher_1.Match(i) {
|
} else if cs_CSharp_Matcher_0.Match(string(i)) || cs_CSharp_Matcher_1.Match(string(i)) {
|
||||||
return []string{"C#"}
|
return []string{"C#"}
|
||||||
}
|
}
|
||||||
|
|
||||||
return nil
|
return nil
|
||||||
},
|
},
|
||||||
".d": func(i []byte) []string {
|
".d": func(i []byte) []string {
|
||||||
if d_D_Matcher_0.Match(i) {
|
if d_D_Matcher_0.Match(string(i)) {
|
||||||
return []string{"D"}
|
return []string{"D"}
|
||||||
} else if d_DTrace_Matcher_0.Match(i) {
|
} else if d_DTrace_Matcher_0.Match(string(i)) {
|
||||||
return []string{"DTrace"}
|
return []string{"DTrace"}
|
||||||
} else if d_Makefile_Matcher_0.Match(i) {
|
} else if d_Makefile_Matcher_0.Match(string(i)) {
|
||||||
return []string{"Makefile"}
|
return []string{"Makefile"}
|
||||||
}
|
}
|
||||||
|
|
||||||
return nil
|
return nil
|
||||||
},
|
},
|
||||||
".ecl": func(i []byte) []string {
|
".ecl": func(i []byte) []string {
|
||||||
if ecl_ECLiPSe_Matcher_0.Match(i) {
|
if ecl_ECLiPSe_Matcher_0.Match(string(i)) {
|
||||||
return []string{"ECLiPSe"}
|
return []string{"ECLiPSe"}
|
||||||
} else if ecl_ECL_Matcher_0.Match(i) {
|
} else if ecl_ECL_Matcher_0.Match(string(i)) {
|
||||||
return []string{"ECL"}
|
return []string{"ECL"}
|
||||||
}
|
}
|
||||||
|
|
||||||
return nil
|
return nil
|
||||||
},
|
},
|
||||||
".es": func(i []byte) []string {
|
".es": func(i []byte) []string {
|
||||||
if es_Erlang_Matcher_0.Match(i) {
|
if es_Erlang_Matcher_0.Match(string(i)) {
|
||||||
return []string{"Erlang"}
|
return []string{"Erlang"}
|
||||||
}
|
}
|
||||||
|
|
||||||
return nil
|
return nil
|
||||||
},
|
},
|
||||||
".f": func(i []byte) []string {
|
".f": func(i []byte) []string {
|
||||||
if f_Forth_Matcher_0.Match(i) {
|
if f_Forth_Matcher_0.Match(string(i)) {
|
||||||
return []string{"Forth"}
|
return []string{"Forth"}
|
||||||
} else if f_FilebenchWML_Matcher_0.Match(i) {
|
} else if f_FilebenchWML_Matcher_0.Match(string(i)) {
|
||||||
return []string{"Filebench WML"}
|
return []string{"Filebench WML"}
|
||||||
} else if f_Fortran_Matcher_0.Match(i) {
|
} else if f_Fortran_Matcher_0.Match(string(i)) {
|
||||||
return []string{"Fortran"}
|
return []string{"Fortran"}
|
||||||
}
|
}
|
||||||
|
|
||||||
return nil
|
return nil
|
||||||
},
|
},
|
||||||
".for": func(i []byte) []string {
|
".for": func(i []byte) []string {
|
||||||
if for_Forth_Matcher_0.Match(i) {
|
if for_Forth_Matcher_0.Match(string(i)) {
|
||||||
return []string{"Forth"}
|
return []string{"Forth"}
|
||||||
} else if for_Fortran_Matcher_0.Match(i) {
|
} else if for_Fortran_Matcher_0.Match(string(i)) {
|
||||||
return []string{"Fortran"}
|
return []string{"Fortran"}
|
||||||
}
|
}
|
||||||
|
|
||||||
return nil
|
return nil
|
||||||
},
|
},
|
||||||
".fr": func(i []byte) []string {
|
".fr": func(i []byte) []string {
|
||||||
if fr_Forth_Matcher_0.Match(i) {
|
if fr_Forth_Matcher_0.Match(string(i)) {
|
||||||
return []string{"Forth"}
|
return []string{"Forth"}
|
||||||
} else if fr_Frege_Matcher_0.Match(i) {
|
} else if fr_Frege_Matcher_0.Match(string(i)) {
|
||||||
return []string{"Frege"}
|
return []string{"Frege"}
|
||||||
}
|
}
|
||||||
|
|
||||||
return []string{"Text"}
|
return []string{"Text"}
|
||||||
},
|
},
|
||||||
".fs": func(i []byte) []string {
|
".fs": func(i []byte) []string {
|
||||||
if fs_Forth_Matcher_0.Match(i) {
|
if fs_Forth_Matcher_0.Match(string(i)) {
|
||||||
return []string{"Forth"}
|
return []string{"Forth"}
|
||||||
} else if fs_FSharp_Matcher_0.Match(i) {
|
} else if fs_FSharp_Matcher_0.Match(string(i)) {
|
||||||
return []string{"F#"}
|
return []string{"F#"}
|
||||||
} else if fs_GLSL_Matcher_0.Match(i) {
|
} else if fs_GLSL_Matcher_0.Match(string(i)) {
|
||||||
return []string{"GLSL"}
|
return []string{"GLSL"}
|
||||||
} else if fs_Filterscript_Matcher_0.Match(i) {
|
} else if fs_Filterscript_Matcher_0.Match(string(i)) {
|
||||||
return []string{"Filterscript"}
|
return []string{"Filterscript"}
|
||||||
}
|
}
|
||||||
|
|
||||||
return nil
|
return nil
|
||||||
},
|
},
|
||||||
".gs": func(i []byte) []string {
|
".gs": func(i []byte) []string {
|
||||||
if gs_Gosu_Matcher_0.Match(i) {
|
if gs_Gosu_Matcher_0.Match(string(i)) {
|
||||||
return []string{"Gosu"}
|
return []string{"Gosu"}
|
||||||
}
|
}
|
||||||
|
|
||||||
return nil
|
return nil
|
||||||
},
|
},
|
||||||
".h": func(i []byte) []string {
|
".h": func(i []byte) []string {
|
||||||
if h_ObjectiveDashC_Matcher_0.Match(i) {
|
if h_ObjectiveDashC_Matcher_0.Match(string(i)) {
|
||||||
return []string{"Objective-C"}
|
return []string{"Objective-C"}
|
||||||
} else if h_CPlusPlus_Matcher_0.Match(i) || h_CPlusPlus_Matcher_1.Match(i) || h_CPlusPlus_Matcher_2.Match(i) || h_CPlusPlus_Matcher_3.Match(i) || h_CPlusPlus_Matcher_4.Match(i) || h_CPlusPlus_Matcher_5.Match(i) || h_CPlusPlus_Matcher_6.Match(i) {
|
} else if h_CPlusPlus_Matcher_0.Match(string(i)) || h_CPlusPlus_Matcher_1.Match(string(i)) || h_CPlusPlus_Matcher_2.Match(string(i)) || h_CPlusPlus_Matcher_3.Match(string(i)) || h_CPlusPlus_Matcher_4.Match(string(i)) || h_CPlusPlus_Matcher_5.Match(string(i)) || h_CPlusPlus_Matcher_6.Match(string(i)) {
|
||||||
return []string{"C++"}
|
return []string{"C++"}
|
||||||
}
|
}
|
||||||
|
|
||||||
return nil
|
return nil
|
||||||
},
|
},
|
||||||
".inc": func(i []byte) []string {
|
".inc": func(i []byte) []string {
|
||||||
if inc_PHP_Matcher_0.Match(i) {
|
if inc_PHP_Matcher_0.Match(string(i)) {
|
||||||
return []string{"PHP"}
|
return []string{"PHP"}
|
||||||
} else if inc_POVDashRaySDL_Matcher_0.Match(i) {
|
} else if inc_POVDashRaySDL_Matcher_0.Match(string(i)) {
|
||||||
return []string{"POV-Ray SDL"}
|
return []string{"POV-Ray SDL"}
|
||||||
}
|
}
|
||||||
|
|
||||||
return nil
|
return nil
|
||||||
},
|
},
|
||||||
".l": func(i []byte) []string {
|
".l": func(i []byte) []string {
|
||||||
if l_CommonLisp_Matcher_0.Match(i) {
|
if l_CommonLisp_Matcher_0.Match(string(i)) {
|
||||||
return []string{"Common Lisp"}
|
return []string{"Common Lisp"}
|
||||||
} else if l_Lex_Matcher_0.Match(i) {
|
} else if l_Lex_Matcher_0.Match(string(i)) {
|
||||||
return []string{"Lex"}
|
return []string{"Lex"}
|
||||||
} else if l_Roff_Matcher_0.Match(i) {
|
} else if l_Roff_Matcher_0.Match(string(i)) {
|
||||||
return []string{"Roff"}
|
return []string{"Roff"}
|
||||||
} else if l_PicoLisp_Matcher_0.Match(i) {
|
} else if l_PicoLisp_Matcher_0.Match(string(i)) {
|
||||||
return []string{"PicoLisp"}
|
return []string{"PicoLisp"}
|
||||||
}
|
}
|
||||||
|
|
||||||
return nil
|
return nil
|
||||||
},
|
},
|
||||||
".ls": func(i []byte) []string {
|
".ls": func(i []byte) []string {
|
||||||
if ls_LoomScript_Matcher_0.Match(i) {
|
if ls_LoomScript_Matcher_0.Match(string(i)) {
|
||||||
return []string{"LoomScript"}
|
return []string{"LoomScript"}
|
||||||
}
|
}
|
||||||
|
|
||||||
return []string{"LiveScript"}
|
return []string{"LiveScript"}
|
||||||
},
|
},
|
||||||
".lsp": func(i []byte) []string {
|
".lsp": func(i []byte) []string {
|
||||||
if lsp_CommonLisp_Matcher_0.Match(i) {
|
if lsp_CommonLisp_Matcher_0.Match(string(i)) {
|
||||||
return []string{"Common Lisp"}
|
return []string{"Common Lisp"}
|
||||||
} else if lsp_NewLisp_Matcher_0.Match(i) {
|
} else if lsp_NewLisp_Matcher_0.Match(string(i)) {
|
||||||
return []string{"NewLisp"}
|
return []string{"NewLisp"}
|
||||||
}
|
}
|
||||||
|
|
||||||
return nil
|
return nil
|
||||||
},
|
},
|
||||||
".lisp": func(i []byte) []string {
|
".lisp": func(i []byte) []string {
|
||||||
if lisp_CommonLisp_Matcher_0.Match(i) {
|
if lisp_CommonLisp_Matcher_0.Match(string(i)) {
|
||||||
return []string{"Common Lisp"}
|
return []string{"Common Lisp"}
|
||||||
} else if lisp_NewLisp_Matcher_0.Match(i) {
|
} else if lisp_NewLisp_Matcher_0.Match(string(i)) {
|
||||||
return []string{"NewLisp"}
|
return []string{"NewLisp"}
|
||||||
}
|
}
|
||||||
|
|
||||||
return nil
|
return nil
|
||||||
},
|
},
|
||||||
".m": func(i []byte) []string {
|
".m": func(i []byte) []string {
|
||||||
if m_ObjectiveDashC_Matcher_0.Match(i) {
|
if m_ObjectiveDashC_Matcher_0.Match(string(i)) {
|
||||||
return []string{"Objective-C"}
|
return []string{"Objective-C"}
|
||||||
} else if m_Mercury_Matcher_0.Match(i) {
|
} else if m_Mercury_Matcher_0.Match(string(i)) {
|
||||||
return []string{"Mercury"}
|
return []string{"Mercury"}
|
||||||
} else if m_MUF_Matcher_0.Match(i) {
|
} else if m_MUF_Matcher_0.Match(string(i)) {
|
||||||
return []string{"MUF"}
|
return []string{"MUF"}
|
||||||
} else if m_M_Matcher_0.Match(i) {
|
} else if m_M_Matcher_0.Match(string(i)) {
|
||||||
return []string{"M"}
|
return []string{"M"}
|
||||||
} else if m_Mathematica_Matcher_0.Match(i) {
|
} else if m_Mathematica_Matcher_0.Match(string(i)) {
|
||||||
return []string{"Mathematica"}
|
return []string{"Mathematica"}
|
||||||
} else if m_Matlab_Matcher_0.Match(i) {
|
} else if m_Matlab_Matcher_0.Match(string(i)) {
|
||||||
return []string{"Matlab"}
|
return []string{"Matlab"}
|
||||||
} else if m_Limbo_Matcher_0.Match(i) {
|
} else if m_Limbo_Matcher_0.Match(string(i)) {
|
||||||
return []string{"Limbo"}
|
return []string{"Limbo"}
|
||||||
}
|
}
|
||||||
|
|
||||||
return nil
|
return nil
|
||||||
},
|
},
|
||||||
".md": func(i []byte) []string {
|
".md": func(i []byte) []string {
|
||||||
if md_Markdown_Matcher_0.Match(i) || md_Markdown_Matcher_1.Match(i) {
|
if md_Markdown_Matcher_0.Match(string(i)) || md_Markdown_Matcher_1.Match(string(i)) {
|
||||||
return []string{"Markdown"}
|
return []string{"Markdown"}
|
||||||
} else if md_GCCMachineDescription_Matcher_0.Match(i) {
|
} else if md_GCCMachineDescription_Matcher_0.Match(string(i)) {
|
||||||
return []string{"GCC Machine Description"}
|
return []string{"GCC Machine Description"}
|
||||||
}
|
}
|
||||||
|
|
||||||
return []string{"Markdown"}
|
return []string{"Markdown"}
|
||||||
},
|
},
|
||||||
".ml": func(i []byte) []string {
|
".ml": func(i []byte) []string {
|
||||||
if ml_OCaml_Matcher_0.Match(i) {
|
if ml_OCaml_Matcher_0.Match(string(i)) {
|
||||||
return []string{"OCaml"}
|
return []string{"OCaml"}
|
||||||
} else if ml_StandardML_Matcher_0.Match(i) {
|
} else if ml_StandardML_Matcher_0.Match(string(i)) {
|
||||||
return []string{"Standard ML"}
|
return []string{"Standard ML"}
|
||||||
}
|
}
|
||||||
|
|
||||||
return nil
|
return nil
|
||||||
},
|
},
|
||||||
".mod": func(i []byte) []string {
|
".mod": func(i []byte) []string {
|
||||||
if mod_XML_Matcher_0.Match(i) {
|
if mod_XML_Matcher_0.Match(string(i)) {
|
||||||
return []string{"XML"}
|
return []string{"XML"}
|
||||||
} else if mod_ModulaDash2_Matcher_0.Match(i) || mod_ModulaDash2_Matcher_1.Match(i) {
|
} else if mod_ModulaDash2_Matcher_0.Match(string(i)) || mod_ModulaDash2_Matcher_1.Match(string(i)) {
|
||||||
return []string{"Modula-2"}
|
return []string{"Modula-2"}
|
||||||
}
|
}
|
||||||
|
|
||||||
return []string{"Linux Kernel Module", "AMPL"}
|
return []string{"Linux Kernel Module", "AMPL"}
|
||||||
},
|
},
|
||||||
".ms": func(i []byte) []string {
|
".ms": func(i []byte) []string {
|
||||||
if ms_Roff_Matcher_0.Match(i) {
|
if ms_Roff_Matcher_0.Match(string(i)) {
|
||||||
return []string{"Roff"}
|
return []string{"Roff"}
|
||||||
}
|
}
|
||||||
|
|
||||||
return []string{"MAXScript"}
|
return []string{"MAXScript"}
|
||||||
},
|
},
|
||||||
".n": func(i []byte) []string {
|
".n": func(i []byte) []string {
|
||||||
if n_Roff_Matcher_0.Match(i) {
|
if n_Roff_Matcher_0.Match(string(i)) {
|
||||||
return []string{"Roff"}
|
return []string{"Roff"}
|
||||||
} else if n_Nemerle_Matcher_0.Match(i) {
|
} else if n_Nemerle_Matcher_0.Match(string(i)) {
|
||||||
return []string{"Nemerle"}
|
return []string{"Nemerle"}
|
||||||
}
|
}
|
||||||
|
|
||||||
return nil
|
return nil
|
||||||
},
|
},
|
||||||
".ncl": func(i []byte) []string {
|
".ncl": func(i []byte) []string {
|
||||||
if ncl_Text_Matcher_0.Match(i) {
|
if ncl_Text_Matcher_0.Match(string(i)) {
|
||||||
return []string{"Text"}
|
return []string{"Text"}
|
||||||
}
|
}
|
||||||
|
|
||||||
return nil
|
return nil
|
||||||
},
|
},
|
||||||
".nl": func(i []byte) []string {
|
".nl": func(i []byte) []string {
|
||||||
if nl_NL_Matcher_0.Match(i) {
|
if nl_NL_Matcher_0.Match(string(i)) {
|
||||||
return []string{"NL"}
|
return []string{"NL"}
|
||||||
}
|
}
|
||||||
|
|
||||||
return []string{"NewLisp"}
|
return []string{"NewLisp"}
|
||||||
},
|
},
|
||||||
".php": func(i []byte) []string {
|
".php": func(i []byte) []string {
|
||||||
if php_Hack_Matcher_0.Match(i) {
|
if php_Hack_Matcher_0.Match(string(i)) {
|
||||||
return []string{"Hack"}
|
return []string{"Hack"}
|
||||||
} else if php_PHP_Matcher_0.Match(i) {
|
} else if php_PHP_Matcher_0.Match(string(i)) {
|
||||||
return []string{"PHP"}
|
return []string{"PHP"}
|
||||||
}
|
}
|
||||||
|
|
||||||
return nil
|
return nil
|
||||||
},
|
},
|
||||||
".pl": func(i []byte) []string {
|
".pl": func(i []byte) []string {
|
||||||
if pl_Prolog_Matcher_0.Match(i) {
|
if pl_Prolog_Matcher_0.Match(string(i)) {
|
||||||
return []string{"Prolog"}
|
return []string{"Prolog"}
|
||||||
} else if pl_Perl_Matcher_0.Match(i) {
|
} else if pl_Perl_Matcher_0.Match(string(i)) {
|
||||||
return []string{"Perl"}
|
return []string{"Perl"}
|
||||||
} else if pl_Perl6_Matcher_0.Match(i) {
|
} else if pl_Perl6_Matcher_0.Match(string(i)) {
|
||||||
return []string{"Perl 6"}
|
return []string{"Perl 6"}
|
||||||
}
|
}
|
||||||
|
|
||||||
return nil
|
return nil
|
||||||
},
|
},
|
||||||
".pm": func(i []byte) []string {
|
".pm": func(i []byte) []string {
|
||||||
if pm_Perl6_Matcher_0.Match(i) {
|
if pm_Perl6_Matcher_0.Match(string(i)) {
|
||||||
return []string{"Perl 6"}
|
return []string{"Perl 6"}
|
||||||
} else if pm_Perl_Matcher_0.Match(i) {
|
} else if pm_Perl_Matcher_0.Match(string(i)) {
|
||||||
return []string{"Perl"}
|
return []string{"Perl"}
|
||||||
}
|
}
|
||||||
|
|
||||||
return nil
|
return nil
|
||||||
},
|
},
|
||||||
".pod": func(i []byte) []string {
|
".pod": func(i []byte) []string {
|
||||||
if pod_Pod_Matcher_0.Match(i) {
|
if pod_Pod_Matcher_0.Match(string(i)) {
|
||||||
return []string{"Pod"}
|
return []string{"Pod"}
|
||||||
}
|
}
|
||||||
|
|
||||||
return []string{"Perl"}
|
return []string{"Perl"}
|
||||||
},
|
},
|
||||||
".pro": func(i []byte) []string {
|
".pro": func(i []byte) []string {
|
||||||
if pro_Prolog_Matcher_0.Match(i) {
|
if pro_Prolog_Matcher_0.Match(string(i)) {
|
||||||
return []string{"Prolog"}
|
return []string{"Prolog"}
|
||||||
} else if pro_INI_Matcher_0.Match(i) {
|
} else if pro_INI_Matcher_0.Match(string(i)) {
|
||||||
return []string{"INI"}
|
return []string{"INI"}
|
||||||
} else if pro_QMake_Matcher_0.Match(i) && pro_QMake_Matcher_1.Match(i) {
|
} else if pro_QMake_Matcher_0.Match(string(i)) && pro_QMake_Matcher_1.Match(string(i)) {
|
||||||
return []string{"QMake"}
|
return []string{"QMake"}
|
||||||
} else if pro_IDL_Matcher_0.Match(i) {
|
} else if pro_IDL_Matcher_0.Match(string(i)) {
|
||||||
return []string{"IDL"}
|
return []string{"IDL"}
|
||||||
}
|
}
|
||||||
|
|
||||||
return nil
|
return nil
|
||||||
},
|
},
|
||||||
".props": func(i []byte) []string {
|
".props": func(i []byte) []string {
|
||||||
if props_XML_Matcher_0.Match(i) {
|
if props_XML_Matcher_0.Match(string(i)) {
|
||||||
return []string{"XML"}
|
return []string{"XML"}
|
||||||
} else if props_INI_Matcher_0.Match(i) {
|
} else if props_INI_Matcher_0.Match(string(i)) {
|
||||||
return []string{"INI"}
|
return []string{"INI"}
|
||||||
}
|
}
|
||||||
|
|
||||||
return nil
|
return nil
|
||||||
},
|
},
|
||||||
".r": func(i []byte) []string {
|
".r": func(i []byte) []string {
|
||||||
if r_Rebol_Matcher_0.Match(i) {
|
if r_Rebol_Matcher_0.Match(string(i)) {
|
||||||
return []string{"Rebol"}
|
return []string{"Rebol"}
|
||||||
} else if r_R_Matcher_0.Match(i) {
|
} else if r_R_Matcher_0.Match(string(i)) {
|
||||||
return []string{"R"}
|
return []string{"R"}
|
||||||
}
|
}
|
||||||
|
|
||||||
return nil
|
return nil
|
||||||
},
|
},
|
||||||
".rno": func(i []byte) []string {
|
".rno": func(i []byte) []string {
|
||||||
if rno_RUNOFF_Matcher_0.Match(i) {
|
if rno_RUNOFF_Matcher_0.Match(string(i)) {
|
||||||
return []string{"RUNOFF"}
|
return []string{"RUNOFF"}
|
||||||
} else if rno_Roff_Matcher_0.Match(i) {
|
} else if rno_Roff_Matcher_0.Match(string(i)) {
|
||||||
return []string{"Roff"}
|
return []string{"Roff"}
|
||||||
}
|
}
|
||||||
|
|
||||||
return nil
|
return nil
|
||||||
},
|
},
|
||||||
".rpy": func(i []byte) []string {
|
".rpy": func(i []byte) []string {
|
||||||
if rpy_Python_Matcher_0.Match(i) {
|
if rpy_Python_Matcher_0.Match(string(i)) {
|
||||||
return []string{"Python"}
|
return []string{"Python"}
|
||||||
}
|
}
|
||||||
|
|
||||||
return []string{"Ren'Py"}
|
return []string{"Ren'Py"}
|
||||||
},
|
},
|
||||||
".rs": func(i []byte) []string {
|
".rs": func(i []byte) []string {
|
||||||
if rs_Rust_Matcher_0.Match(i) {
|
if rs_Rust_Matcher_0.Match(string(i)) {
|
||||||
return []string{"Rust"}
|
return []string{"Rust"}
|
||||||
} else if rs_RenderScript_Matcher_0.Match(i) {
|
} else if rs_RenderScript_Matcher_0.Match(string(i)) {
|
||||||
return []string{"RenderScript"}
|
return []string{"RenderScript"}
|
||||||
}
|
}
|
||||||
|
|
||||||
return nil
|
return nil
|
||||||
},
|
},
|
||||||
".sc": func(i []byte) []string {
|
".sc": func(i []byte) []string {
|
||||||
if sc_SuperCollider_Matcher_0.Match(i) || sc_SuperCollider_Matcher_1.Match(i) || sc_SuperCollider_Matcher_2.Match(i) {
|
if sc_SuperCollider_Matcher_0.Match(string(i)) || sc_SuperCollider_Matcher_1.Match(string(i)) || sc_SuperCollider_Matcher_2.Match(string(i)) {
|
||||||
return []string{"SuperCollider"}
|
return []string{"SuperCollider"}
|
||||||
} else if sc_Scala_Matcher_0.Match(i) || sc_Scala_Matcher_1.Match(i) || sc_Scala_Matcher_2.Match(i) {
|
} else if sc_Scala_Matcher_0.Match(string(i)) || sc_Scala_Matcher_1.Match(string(i)) || sc_Scala_Matcher_2.Match(string(i)) {
|
||||||
return []string{"Scala"}
|
return []string{"Scala"}
|
||||||
}
|
}
|
||||||
|
|
||||||
return nil
|
return nil
|
||||||
},
|
},
|
||||||
".sql": func(i []byte) []string {
|
".sql": func(i []byte) []string {
|
||||||
if sql_PLpgSQL_Matcher_0.Match(i) || sql_PLpgSQL_Matcher_1.Match(i) || sql_PLpgSQL_Matcher_2.Match(i) {
|
if sql_PLpgSQL_Matcher_0.Match(string(i)) || sql_PLpgSQL_Matcher_1.Match(string(i)) || sql_PLpgSQL_Matcher_2.Match(string(i)) {
|
||||||
return []string{"PLpgSQL"}
|
return []string{"PLpgSQL"}
|
||||||
} else if sql_SQLPL_Matcher_0.Match(i) || sql_SQLPL_Matcher_1.Match(i) {
|
} else if sql_SQLPL_Matcher_0.Match(string(i)) || sql_SQLPL_Matcher_1.Match(string(i)) {
|
||||||
return []string{"SQLPL"}
|
return []string{"SQLPL"}
|
||||||
} else if sql_PLSQL_Matcher_0.Match(i) || sql_PLSQL_Matcher_1.Match(i) {
|
} else if sql_PLSQL_Matcher_0.Match(string(i)) || sql_PLSQL_Matcher_1.Match(string(i)) {
|
||||||
return []string{"PLSQL"}
|
return []string{"PLSQL"}
|
||||||
} else if sql_SQL_Matcher_0.Match(i) {
|
} else if sql_SQL_Matcher_0.Match(string(i)) {
|
||||||
return []string{"SQL"}
|
return []string{"SQL"}
|
||||||
}
|
}
|
||||||
|
|
||||||
return nil
|
return nil
|
||||||
},
|
},
|
||||||
".srt": func(i []byte) []string {
|
".srt": func(i []byte) []string {
|
||||||
if srt_SubRipText_Matcher_0.Match(i) {
|
if srt_SubRipText_Matcher_0.Match(string(i)) {
|
||||||
return []string{"SubRip Text"}
|
return []string{"SubRip Text"}
|
||||||
}
 		}

 		return nil
 	},
 	".t": func(i []byte) []string {
-		if t_Turing_Matcher_0.Match(i) {
+		if t_Turing_Matcher_0.Match(string(i)) {
 			return []string{"Turing"}
-		} else if t_Perl6_Matcher_0.Match(i) {
+		} else if t_Perl6_Matcher_0.Match(string(i)) {
 			return []string{"Perl 6"}
-		} else if t_Perl_Matcher_0.Match(i) {
+		} else if t_Perl_Matcher_0.Match(string(i)) {
 			return []string{"Perl"}
 		}

 		return nil
 	},
 	".toc": func(i []byte) []string {
-		if toc_WorldofWarcraftAddonData_Matcher_0.Match(i) {
+		if toc_WorldofWarcraftAddonData_Matcher_0.Match(string(i)) {
 			return []string{"World of Warcraft Addon Data"}
-		} else if toc_TeX_Matcher_0.Match(i) {
+		} else if toc_TeX_Matcher_0.Match(string(i)) {
 			return []string{"TeX"}
 		}

 		return nil
 	},
 	".ts": func(i []byte) []string {
-		if ts_XML_Matcher_0.Match(i) {
+		if ts_XML_Matcher_0.Match(string(i)) {
 			return []string{"XML"}
 		}

 		return []string{"TypeScript"}
 	},
 	".tst": func(i []byte) []string {
-		if tst_GAP_Matcher_0.Match(i) {
+		if tst_GAP_Matcher_0.Match(string(i)) {
 			return []string{"GAP"}
 		}

 		return []string{"Scilab"}
 	},
 	".tsx": func(i []byte) []string {
-		if tsx_TypeScript_Matcher_0.Match(i) {
+		if tsx_TypeScript_Matcher_0.Match(string(i)) {
 			return []string{"TypeScript"}
-		} else if tsx_XML_Matcher_0.Match(i) {
+		} else if tsx_XML_Matcher_0.Match(string(i)) {
 			return []string{"XML"}
 		}

@@ -447,122 +445,122 @@ var ContentMatchers = map[string]languageMatcher{
 }

 var (
-	asc_PublicKey_Matcher_0 = regexp.MustCompile(`(?m)^(----[- ]BEGIN|ssh-(rsa|dss)) `)
+	asc_PublicKey_Matcher_0 = substring.Regexp(`(?m)^(----[- ]BEGIN|ssh-(rsa|dss)) `)
-	asc_AsciiDoc_Matcher_0 = regexp.MustCompile(`(?m)^[=-]+(\s|\n)|{{[A-Za-z]`)
+	asc_AsciiDoc_Matcher_0 = substring.Regexp(`(?m)^[=-]+(\s|\n)|{{[A-Za-z]`)
-	asc_AGSScript_Matcher_0 = regexp.MustCompile(`(?m)^(\/\/.+|((import|export)\s+)?(function|int|float|char)\s+((room|repeatedly|on|game)_)?([A-Za-z]+[A-Za-z_0-9]+)\s*[;\(])`)
+	asc_AGSScript_Matcher_0 = substring.Regexp(`(?m)^(\/\/.+|((import|export)\s+)?(function|int|float|char)\s+((room|repeatedly|on|game)_)?([A-Za-z]+[A-Za-z_0-9]+)\s*[;\(])`)
-	bb_BlitzBasic_Matcher_0 = regexp.MustCompile(`(?m)^\s*; `)
+	bb_BlitzBasic_Matcher_0 = substring.Regexp(`(?m)^\s*; `)
-	bb_BlitzBasic_Matcher_1 = regexp.MustCompile(`(?m)End Function`)
+	bb_BlitzBasic_Matcher_1 = substring.Regexp(`(?m)End Function`)
-	bb_BitBake_Matcher_0 = regexp.MustCompile(`(?m)^\s*(# |include|require)\b`)
+	bb_BitBake_Matcher_0 = substring.Regexp(`(?m)^\s*(# |include|require)\b`)
-	builds_XML_Matcher_0 = regexp.MustCompile(`(?mi)^(\s*)(<Project|<Import|<Property|<?xml|xmlns)`)
+	builds_XML_Matcher_0 = substring.Regexp(`(?mi)^(\s*)(<Project|<Import|<Property|<?xml|xmlns)`)
-	ch_xBase_Matcher_0 = regexp.MustCompile(`(?mi)^\s*#\s*(if|ifdef|ifndef|define|command|xcommand|translate|xtranslate|include|pragma|undef)\b`)
+	ch_xBase_Matcher_0 = substring.Regexp(`(?mi)^\s*#\s*(if|ifdef|ifndef|define|command|xcommand|translate|xtranslate|include|pragma|undef)\b`)
-	cl_CommonLisp_Matcher_0 = regexp.MustCompile(`(?mi)^\s*\((defun|in-package|defpackage) `)
+	cl_CommonLisp_Matcher_0 = substring.Regexp(`(?mi)^\s*\((defun|in-package|defpackage) `)
-	cl_Cool_Matcher_0 = regexp.MustCompile(`(?m)^class`)
+	cl_Cool_Matcher_0 = substring.Regexp(`(?m)^class`)
-	cl_OpenCL_Matcher_0 = regexp.MustCompile(`(?m)\/\* |\/\/ |^\}`)
+	cl_OpenCL_Matcher_0 = substring.Regexp(`(?m)\/\* |\/\/ |^\}`)
-	cls_TeX_Matcher_0 = regexp.MustCompile(`(?m)\\\w+{`)
+	cls_TeX_Matcher_0 = substring.Regexp(`(?m)\\\w+{`)
-	cs_Smalltalk_Matcher_0 = regexp.MustCompile(`(?m)![\w\s]+methodsFor: `)
+	cs_Smalltalk_Matcher_0 = substring.Regexp(`(?m)![\w\s]+methodsFor: `)
-	cs_CSharp_Matcher_0 = regexp.MustCompile(`(?m)^\s*namespace\s*[\w\.]+\s*{`)
+	cs_CSharp_Matcher_0 = substring.Regexp(`(?m)^\s*namespace\s*[\w\.]+\s*{`)
-	cs_CSharp_Matcher_1 = regexp.MustCompile(`(?m)^\s*\/\/`)
+	cs_CSharp_Matcher_1 = substring.Regexp(`(?m)^\s*\/\/`)
-	d_D_Matcher_0 = regexp.MustCompile(`(?m)^module\s+[\w.]*\s*;|import\s+[\w\s,.:]*;|\w+\s+\w+\s*\(.*\)(?:\(.*\))?\s*{[^}]*}|unittest\s*(?:\(.*\))?\s*{[^}]*}`)
+	d_D_Matcher_0 = substring.Regexp(`(?m)^module\s+[\w.]*\s*;|import\s+[\w\s,.:]*;|\w+\s+\w+\s*\(.*\)(?:\(.*\))?\s*{[^}]*}|unittest\s*(?:\(.*\))?\s*{[^}]*}`)
-	d_DTrace_Matcher_0 = regexp.MustCompile(`(?m)^(\w+:\w*:\w*:\w*|BEGIN|END|provider\s+|(tick|profile)-\w+\s+{[^}]*}|#pragma\s+D\s+(option|attributes|depends_on)\s|#pragma\s+ident\s)`)
+	d_DTrace_Matcher_0 = substring.Regexp(`(?m)^(\w+:\w*:\w*:\w*|BEGIN|END|provider\s+|(tick|profile)-\w+\s+{[^}]*}|#pragma\s+D\s+(option|attributes|depends_on)\s|#pragma\s+ident\s)`)
-	d_Makefile_Matcher_0 = regexp.MustCompile(`(?m)([\/\\].*:\s+.*\s\\$|: \\$|^ : |^[\w\s\/\\.]+\w+\.\w+\s*:\s+[\w\s\/\\.]+\w+\.\w+)`)
+	d_Makefile_Matcher_0 = substring.Regexp(`(?m)([\/\\].*:\s+.*\s\\$|: \\$|^ : |^[\w\s\/\\.]+\w+\.\w+\s*:\s+[\w\s\/\\.]+\w+\.\w+)`)
-	ecl_ECLiPSe_Matcher_0 = regexp.MustCompile(`(?m)^[^#]+:-`)
+	ecl_ECLiPSe_Matcher_0 = substring.Regexp(`(?m)^[^#]+:-`)
-	ecl_ECL_Matcher_0 = regexp.MustCompile(`(?m):=`)
+	ecl_ECL_Matcher_0 = substring.Regexp(`(?m):=`)
-	es_Erlang_Matcher_0 = regexp.MustCompile(`(?m)^\s*(?:%%|main\s*\(.*?\)\s*->)`)
+	es_Erlang_Matcher_0 = substring.Regexp(`(?m)^\s*(?:%%|main\s*\(.*?\)\s*->)`)
-	f_Forth_Matcher_0 = regexp.MustCompile(`(?m)^: `)
+	f_Forth_Matcher_0 = substring.Regexp(`(?m)^: `)
-	f_FilebenchWML_Matcher_0 = regexp.MustCompile(`(?m)flowop`)
+	f_FilebenchWML_Matcher_0 = substring.Regexp(`(?m)flowop`)
-	f_Fortran_Matcher_0 = regexp.MustCompile(`(?mi)^([c*][^abd-z]| (subroutine|program|end|data)\s|\s*!)`)
+	f_Fortran_Matcher_0 = substring.Regexp(`(?mi)^([c*][^abd-z]| (subroutine|program|end|data)\s|\s*!)`)
-	for_Forth_Matcher_0 = regexp.MustCompile(`(?m)^: `)
+	for_Forth_Matcher_0 = substring.Regexp(`(?m)^: `)
-	for_Fortran_Matcher_0 = regexp.MustCompile(`(?mi)^([c*][^abd-z]| (subroutine|program|end|data)\s|\s*!)`)
+	for_Fortran_Matcher_0 = substring.Regexp(`(?mi)^([c*][^abd-z]| (subroutine|program|end|data)\s|\s*!)`)
-	fr_Forth_Matcher_0 = regexp.MustCompile(`(?m)^(: |also |new-device|previous )`)
+	fr_Forth_Matcher_0 = substring.Regexp(`(?m)^(: |also |new-device|previous )`)
-	fr_Frege_Matcher_0 = regexp.MustCompile(`(?m)^\s*(import|module|package|data|type) `)
+	fr_Frege_Matcher_0 = substring.Regexp(`(?m)^\s*(import|module|package|data|type) `)
-	fs_Forth_Matcher_0 = regexp.MustCompile(`(?m)^(: |new-device)`)
+	fs_Forth_Matcher_0 = substring.Regexp(`(?m)^(: |new-device)`)
-	fs_FSharp_Matcher_0 = regexp.MustCompile(`(?m)^\s*(#light|import|let|module|namespace|open|type)`)
+	fs_FSharp_Matcher_0 = substring.Regexp(`(?m)^\s*(#light|import|let|module|namespace|open|type)`)
-	fs_GLSL_Matcher_0 = regexp.MustCompile(`(?m)^\s*(#version|precision|uniform|varying|vec[234])`)
+	fs_GLSL_Matcher_0 = substring.Regexp(`(?m)^\s*(#version|precision|uniform|varying|vec[234])`)
-	fs_Filterscript_Matcher_0 = regexp.MustCompile(`(?m)#include|#pragma\s+(rs|version)|__attribute__`)
+	fs_Filterscript_Matcher_0 = substring.Regexp(`(?m)#include|#pragma\s+(rs|version)|__attribute__`)
-	gs_Gosu_Matcher_0 = regexp.MustCompile(`(?m)^uses java\.`)
+	gs_Gosu_Matcher_0 = substring.Regexp(`(?m)^uses java\.`)
-	h_ObjectiveDashC_Matcher_0 = regexp.MustCompile(`(?m)^\s*(@(interface|class|protocol|property|end|synchronised|selector|implementation)\b|#import\s+.+\.h[">])`)
+	h_ObjectiveDashC_Matcher_0 = substring.Regexp(`(?m)^\s*(@(interface|class|protocol|property|end|synchronised|selector|implementation)\b|#import\s+.+\.h[">])`)
-	h_CPlusPlus_Matcher_0 = regexp.MustCompile(`(?m)^\s*#\s*include <(cstdint|string|vector|map|list|array|bitset|queue|stack|forward_list|unordered_map|unordered_set|(i|o|io)stream)>`)
+	h_CPlusPlus_Matcher_0 = substring.Regexp(`(?m)^\s*#\s*include <(cstdint|string|vector|map|list|array|bitset|queue|stack|forward_list|unordered_map|unordered_set|(i|o|io)stream)>`)
-	h_CPlusPlus_Matcher_1 = regexp.MustCompile(`(?m)^\s*template\s*<`)
+	h_CPlusPlus_Matcher_1 = substring.Regexp(`(?m)^\s*template\s*<`)
-	h_CPlusPlus_Matcher_2 = regexp.MustCompile(`(?m)^[ \t]*try`)
+	h_CPlusPlus_Matcher_2 = substring.Regexp(`(?m)^[ \t]*try`)
-	h_CPlusPlus_Matcher_3 = regexp.MustCompile(`(?m)^[ \t]*catch\s*\(`)
+	h_CPlusPlus_Matcher_3 = substring.Regexp(`(?m)^[ \t]*catch\s*\(`)
-	h_CPlusPlus_Matcher_4 = regexp.MustCompile(`(?m)^[ \t]*(class|(using[ \t]+)?namespace)\s+\w+`)
+	h_CPlusPlus_Matcher_4 = substring.Regexp(`(?m)^[ \t]*(class|(using[ \t]+)?namespace)\s+\w+`)
-	h_CPlusPlus_Matcher_5 = regexp.MustCompile(`(?m)^[ \t]*(private|public|protected):$`)
+	h_CPlusPlus_Matcher_5 = substring.Regexp(`(?m)^[ \t]*(private|public|protected):$`)
-	h_CPlusPlus_Matcher_6 = regexp.MustCompile(`(?m)std::\w+`)
+	h_CPlusPlus_Matcher_6 = substring.Regexp(`(?m)std::\w+`)
-	inc_PHP_Matcher_0 = regexp.MustCompile(`(?m)^<\?(?:php)?`)
+	inc_PHP_Matcher_0 = substring.Regexp(`(?m)^<\?(?:php)?`)
-	inc_POVDashRaySDL_Matcher_0 = regexp.MustCompile(`(?m)^\s*#(declare|local|macro|while)\s`)
+	inc_POVDashRaySDL_Matcher_0 = substring.Regexp(`(?m)^\s*#(declare|local|macro|while)\s`)
-	l_CommonLisp_Matcher_0 = regexp.MustCompile(`(?m)\(def(un|macro)\s`)
+	l_CommonLisp_Matcher_0 = substring.Regexp(`(?m)\(def(un|macro)\s`)
-	l_Lex_Matcher_0 = regexp.MustCompile(`(?m)^(%[%{}]xs|<.*>)`)
+	l_Lex_Matcher_0 = substring.Regexp(`(?m)^(%[%{}]xs|<.*>)`)
-	l_Roff_Matcher_0 = regexp.MustCompile(`(?mi)^\.[a-z][a-z](\s|$)`)
+	l_Roff_Matcher_0 = substring.Regexp(`(?mi)^\.[a-z][a-z](\s|$)`)
-	l_PicoLisp_Matcher_0 = regexp.MustCompile(`(?m)^\((de|class|rel|code|data|must)\s`)
+	l_PicoLisp_Matcher_0 = substring.Regexp(`(?m)^\((de|class|rel|code|data|must)\s`)
-	ls_LoomScript_Matcher_0 = regexp.MustCompile(`(?m)^\s*package\s*[\w\.\/\*\s]*\s*{`)
+	ls_LoomScript_Matcher_0 = substring.Regexp(`(?m)^\s*package\s*[\w\.\/\*\s]*\s*{`)
-	lsp_CommonLisp_Matcher_0 = regexp.MustCompile(`(?mi)^\s*\((defun|in-package|defpackage) `)
+	lsp_CommonLisp_Matcher_0 = substring.Regexp(`(?mi)^\s*\((defun|in-package|defpackage) `)
-	lsp_NewLisp_Matcher_0 = regexp.MustCompile(`(?m)^\s*\(define `)
+	lsp_NewLisp_Matcher_0 = substring.Regexp(`(?m)^\s*\(define `)
-	lisp_CommonLisp_Matcher_0 = regexp.MustCompile(`(?mi)^\s*\((defun|in-package|defpackage) `)
+	lisp_CommonLisp_Matcher_0 = substring.Regexp(`(?mi)^\s*\((defun|in-package|defpackage) `)
-	lisp_NewLisp_Matcher_0 = regexp.MustCompile(`(?m)^\s*\(define `)
+	lisp_NewLisp_Matcher_0 = substring.Regexp(`(?m)^\s*\(define `)
-	m_ObjectiveDashC_Matcher_0 = regexp.MustCompile(`(?m)^\s*(@(interface|class|protocol|property|end|synchronised|selector|implementation)\b|#import\s+.+\.h[">])`)
+	m_ObjectiveDashC_Matcher_0 = substring.Regexp(`(?m)^\s*(@(interface|class|protocol|property|end|synchronised|selector|implementation)\b|#import\s+.+\.h[">])`)
-	m_Mercury_Matcher_0 = regexp.MustCompile(`(?m):- module`)
+	m_Mercury_Matcher_0 = substring.Regexp(`(?m):- module`)
-	m_MUF_Matcher_0 = regexp.MustCompile(`(?m)^: `)
+	m_MUF_Matcher_0 = substring.Regexp(`(?m)^: `)
-	m_M_Matcher_0 = regexp.MustCompile(`(?m)^\s*;`)
+	m_M_Matcher_0 = substring.Regexp(`(?m)^\s*;`)
-	m_Mathematica_Matcher_0 = regexp.MustCompile(`(?m)\*\)$`)
+	m_Mathematica_Matcher_0 = substring.Regexp(`(?m)\*\)$`)
-	m_Matlab_Matcher_0 = regexp.MustCompile(`(?m)^\s*%`)
+	m_Matlab_Matcher_0 = substring.Regexp(`(?m)^\s*%`)
-	m_Limbo_Matcher_0 = regexp.MustCompile(`(?m)^\w+\s*:\s*module\s*{`)
+	m_Limbo_Matcher_0 = substring.Regexp(`(?m)^\w+\s*:\s*module\s*{`)
-	md_Markdown_Matcher_0 = regexp.MustCompile(`(?mi)(^[-a-z0-9=#!\*\[|>])|<\/`)
+	md_Markdown_Matcher_0 = substring.Regexp(`(?mi)(^[-a-z0-9=#!\*\[|>])|<\/`)
-	md_Markdown_Matcher_1 = regexp.MustCompile(`^$`)
+	md_Markdown_Matcher_1 = substring.Regexp(`^$`)
-	md_GCCMachineDescription_Matcher_0 = regexp.MustCompile(`(?m)^(;;|\(define_)`)
+	md_GCCMachineDescription_Matcher_0 = substring.Regexp(`(?m)^(;;|\(define_)`)
-	ml_OCaml_Matcher_0 = regexp.MustCompile(`(?m)(^\s*module)|let rec |match\s+(\S+\s)+with`)
+	ml_OCaml_Matcher_0 = substring.Regexp(`(?m)(^\s*module)|let rec |match\s+(\S+\s)+with`)
-	ml_StandardML_Matcher_0 = regexp.MustCompile(`(?m)=> |case\s+(\S+\s)+of`)
+	ml_StandardML_Matcher_0 = substring.Regexp(`(?m)=> |case\s+(\S+\s)+of`)
-	mod_XML_Matcher_0 = regexp.MustCompile(`(?m)<!ENTITY `)
+	mod_XML_Matcher_0 = substring.Regexp(`(?m)<!ENTITY `)
-	mod_ModulaDash2_Matcher_0 = regexp.MustCompile(`(?mi)^\s*MODULE [\w\.]+;`)
+	mod_ModulaDash2_Matcher_0 = substring.Regexp(`(?mi)^\s*MODULE [\w\.]+;`)
-	mod_ModulaDash2_Matcher_1 = regexp.MustCompile(`(?mi)^\s*END [\w\.]+;`)
+	mod_ModulaDash2_Matcher_1 = substring.Regexp(`(?mi)^\s*END [\w\.]+;`)
-	ms_Roff_Matcher_0 = regexp.MustCompile(`(?mi)^[.'][a-z][a-z](\s|$)`)
+	ms_Roff_Matcher_0 = substring.Regexp(`(?mi)^[.'][a-z][a-z](\s|$)`)
-	n_Roff_Matcher_0 = regexp.MustCompile(`(?m)^[.']`)
+	n_Roff_Matcher_0 = substring.Regexp(`(?m)^[.']`)
-	n_Nemerle_Matcher_0 = regexp.MustCompile(`(?m)^(module|namespace|using)\s`)
+	n_Nemerle_Matcher_0 = substring.Regexp(`(?m)^(module|namespace|using)\s`)
-	ncl_Text_Matcher_0 = regexp.MustCompile(`(?m)THE_TITLE`)
+	ncl_Text_Matcher_0 = substring.Regexp(`(?m)THE_TITLE`)
-	nl_NL_Matcher_0 = regexp.MustCompile(`(?m)^(b|g)[0-9]+ `)
+	nl_NL_Matcher_0 = substring.Regexp(`(?m)^(b|g)[0-9]+ `)
-	php_Hack_Matcher_0 = regexp.MustCompile(`(?m)<\?hh`)
+	php_Hack_Matcher_0 = substring.Regexp(`(?m)<\?hh`)
-	php_PHP_Matcher_0 = regexp.MustCompile(`(?m)<?[^h]`)
+	php_PHP_Matcher_0 = substring.Regexp(`(?m)<?[^h]`)
-	pl_Prolog_Matcher_0 = regexp.MustCompile(`(?m)^[^#]*:-`)
+	pl_Prolog_Matcher_0 = substring.Regexp(`(?m)^[^#]*:-`)
-	pl_Perl_Matcher_0 = regexp.MustCompile(`(?m)use strict|use\s+v?5\.`)
+	pl_Perl_Matcher_0 = substring.Regexp(`(?m)use strict|use\s+v?5\.`)
-	pl_Perl6_Matcher_0 = regexp.MustCompile(`(?m)^(use v6|(my )?class|module)`)
+	pl_Perl6_Matcher_0 = substring.Regexp(`(?m)^(use v6|(my )?class|module)`)
-	pm_Perl6_Matcher_0 = regexp.MustCompile(`(?m)^\s*(?:use\s+v6\s*;|(?:\bmy\s+)?class|module)\b`)
+	pm_Perl6_Matcher_0 = substring.Regexp(`(?m)^\s*(?:use\s+v6\s*;|(?:\bmy\s+)?class|module)\b`)
-	pm_Perl_Matcher_0 = regexp.MustCompile(`(?m)\buse\s+(?:strict\b|v?5\.)`)
+	pm_Perl_Matcher_0 = substring.Regexp(`(?m)\buse\s+(?:strict\b|v?5\.)`)
-	pod_Pod_Matcher_0 = regexp.MustCompile(`(?m)^=\w+\b`)
+	pod_Pod_Matcher_0 = substring.Regexp(`(?m)^=\w+\b`)
-	pro_Prolog_Matcher_0 = regexp.MustCompile(`(?m)^[^#]+:-`)
+	pro_Prolog_Matcher_0 = substring.Regexp(`(?m)^[^#]+:-`)
-	pro_INI_Matcher_0 = regexp.MustCompile(`(?m)last_client=`)
+	pro_INI_Matcher_0 = substring.Regexp(`(?m)last_client=`)
-	pro_QMake_Matcher_0 = regexp.MustCompile(`(?m)HEADERS`)
+	pro_QMake_Matcher_0 = substring.Regexp(`(?m)HEADERS`)
-	pro_QMake_Matcher_1 = regexp.MustCompile(`(?m)SOURCES`)
+	pro_QMake_Matcher_1 = substring.Regexp(`(?m)SOURCES`)
-	pro_IDL_Matcher_0 = regexp.MustCompile(`(?m)^\s*function[ \w,]+$`)
+	pro_IDL_Matcher_0 = substring.Regexp(`(?m)^\s*function[ \w,]+$`)
-	props_XML_Matcher_0 = regexp.MustCompile(`(?mi)^(\s*)(<Project|<Import|<Property|<?xml|xmlns)`)
+	props_XML_Matcher_0 = substring.Regexp(`(?mi)^(\s*)(<Project|<Import|<Property|<?xml|xmlns)`)
-	props_INI_Matcher_0 = regexp.MustCompile(`(?mi)\w+\s*=\s*`)
+	props_INI_Matcher_0 = substring.Regexp(`(?mi)\w+\s*=\s*`)
-	r_Rebol_Matcher_0 = regexp.MustCompile(`(?mi)\bRebol\b`)
+	r_Rebol_Matcher_0 = substring.Regexp(`(?mi)\bRebol\b`)
-	r_R_Matcher_0 = regexp.MustCompile(`(?m)<-|^\s*#`)
+	r_R_Matcher_0 = substring.Regexp(`(?m)<-|^\s*#`)
-	rno_RUNOFF_Matcher_0 = regexp.MustCompile(`(?mi)^\.!|^\.end lit(?:eral)?\b`)
+	rno_RUNOFF_Matcher_0 = substring.Regexp(`(?mi)^\.!|^\.end lit(?:eral)?\b`)
-	rno_Roff_Matcher_0 = regexp.MustCompile(`(?m)^\.\\" `)
+	rno_Roff_Matcher_0 = substring.Regexp(`(?m)^\.\\" `)
-	rpy_Python_Matcher_0 = regexp.MustCompile(`(?ms)(^(import|from|class|def)\s)`)
+	rpy_Python_Matcher_0 = substring.Regexp(`(?ms)(^(import|from|class|def)\s)`)
-	rs_Rust_Matcher_0 = regexp.MustCompile(`(?m)^(use |fn |mod |pub |macro_rules|impl|#!?\[)`)
+	rs_Rust_Matcher_0 = substring.Regexp(`(?m)^(use |fn |mod |pub |macro_rules|impl|#!?\[)`)
-	rs_RenderScript_Matcher_0 = regexp.MustCompile(`(?m)#include|#pragma\s+(rs|version)|__attribute__`)
+	rs_RenderScript_Matcher_0 = substring.Regexp(`(?m)#include|#pragma\s+(rs|version)|__attribute__`)
-	sc_SuperCollider_Matcher_0 = regexp.MustCompile(`(?m)\^(this|super)\.`)
+	sc_SuperCollider_Matcher_0 = substring.Regexp(`(?m)\^(this|super)\.`)
-	sc_SuperCollider_Matcher_1 = regexp.MustCompile(`(?m)^\s*(\+|\*)\s*\w+\s*{`)
+	sc_SuperCollider_Matcher_1 = substring.Regexp(`(?m)^\s*(\+|\*)\s*\w+\s*{`)
-	sc_SuperCollider_Matcher_2 = regexp.MustCompile(`(?m)^\s*~\w+\s*=\.`)
+	sc_SuperCollider_Matcher_2 = substring.Regexp(`(?m)^\s*~\w+\s*=\.`)
-	sc_Scala_Matcher_0 = regexp.MustCompile(`(?m)^\s*import (scala|java)\.`)
+	sc_Scala_Matcher_0 = substring.Regexp(`(?m)^\s*import (scala|java)\.`)
-	sc_Scala_Matcher_1 = regexp.MustCompile(`(?m)^\s*val\s+\w+\s*=`)
+	sc_Scala_Matcher_1 = substring.Regexp(`(?m)^\s*val\s+\w+\s*=`)
-	sc_Scala_Matcher_2 = regexp.MustCompile(`(?m)^\s*class\b`)
+	sc_Scala_Matcher_2 = substring.Regexp(`(?m)^\s*class\b`)
-	sql_PLpgSQL_Matcher_0 = regexp.MustCompile(`(?mi)^\\i\b|AS \$\$|LANGUAGE '?plpgsql'?`)
+	sql_PLpgSQL_Matcher_0 = substring.Regexp(`(?mi)^\\i\b|AS \$\$|LANGUAGE '?plpgsql'?`)
-	sql_PLpgSQL_Matcher_1 = regexp.MustCompile(`(?mi)SECURITY (DEFINER|INVOKER)`)
+	sql_PLpgSQL_Matcher_1 = substring.Regexp(`(?mi)SECURITY (DEFINER|INVOKER)`)
-	sql_PLpgSQL_Matcher_2 = regexp.MustCompile(`(?mi)BEGIN( WORK| TRANSACTION)?;`)
+	sql_PLpgSQL_Matcher_2 = substring.Regexp(`(?mi)BEGIN( WORK| TRANSACTION)?;`)
-	sql_SQLPL_Matcher_0 = regexp.MustCompile(`(?mi)(alter module)|(language sql)|(begin( NOT)+ atomic)`)
+	sql_SQLPL_Matcher_0 = substring.Regexp(`(?mi)(alter module)|(language sql)|(begin( NOT)+ atomic)`)
-	sql_SQLPL_Matcher_1 = regexp.MustCompile(`(?mi)signal SQLSTATE '[0-9]+'`)
+	sql_SQLPL_Matcher_1 = substring.Regexp(`(?mi)signal SQLSTATE '[0-9]+'`)
-	sql_PLSQL_Matcher_0 = regexp.MustCompile(`(?mi)\$\$PLSQL_|XMLTYPE|sysdate|systimestamp|\.nextval|connect by|AUTHID (DEFINER|CURRENT_USER)`)
+	sql_PLSQL_Matcher_0 = substring.Regexp(`(?mi)\$\$PLSQL_|XMLTYPE|sysdate|systimestamp|\.nextval|connect by|AUTHID (DEFINER|CURRENT_USER)`)
-	sql_PLSQL_Matcher_1 = regexp.MustCompile(`(?mi)constructor\W+function`)
+	sql_PLSQL_Matcher_1 = substring.Regexp(`(?mi)constructor\W+function`)
-	sql_SQL_Matcher_0 = regexp.MustCompile(`(?mi)! /begin|boolean|package|exception`)
+	sql_SQL_Matcher_0 = substring.Regexp(`(?mi)! /begin|boolean|package|exception`)
-	srt_SubRipText_Matcher_0 = regexp.MustCompile(`(?m)^(\d{2}:\d{2}:\d{2},\d{3})\s*(-->)\s*(\d{2}:\d{2}:\d{2},\d{3})$`)
+	srt_SubRipText_Matcher_0 = substring.Regexp(`(?m)^(\d{2}:\d{2}:\d{2},\d{3})\s*(-->)\s*(\d{2}:\d{2}:\d{2},\d{3})$`)
-	t_Turing_Matcher_0 = regexp.MustCompile(`(?m)^\s*%[ \t]+|^\s*var\s+\w+\s*:=\s*\w+`)
+	t_Turing_Matcher_0 = substring.Regexp(`(?m)^\s*%[ \t]+|^\s*var\s+\w+\s*:=\s*\w+`)
-	t_Perl6_Matcher_0 = regexp.MustCompile(`(?m)^\s*(?:use\s+v6\s*;|\bmodule\b|\b(?:my\s+)?class\b)`)
+	t_Perl6_Matcher_0 = substring.Regexp(`(?m)^\s*(?:use\s+v6\s*;|\bmodule\b|\b(?:my\s+)?class\b)`)
-	t_Perl_Matcher_0 = regexp.MustCompile(`(?m)\buse\s+(?:strict\b|v?5\.)`)
+	t_Perl_Matcher_0 = substring.Regexp(`(?m)\buse\s+(?:strict\b|v?5\.)`)
-	toc_WorldofWarcraftAddonData_Matcher_0 = regexp.MustCompile(`(?m)^## |@no-lib-strip@`)
+	toc_WorldofWarcraftAddonData_Matcher_0 = substring.Regexp(`(?m)^## |@no-lib-strip@`)
-	toc_TeX_Matcher_0 = regexp.MustCompile(`(?m)^\\(contentsline|defcounter|beamer|boolfalse)`)
+	toc_TeX_Matcher_0 = substring.Regexp(`(?m)^\\(contentsline|defcounter|beamer|boolfalse)`)
-	ts_XML_Matcher_0 = regexp.MustCompile(`(?m)<TS`)
+	ts_XML_Matcher_0 = substring.Regexp(`(?m)<TS`)
-	tst_GAP_Matcher_0 = regexp.MustCompile(`(?m)gap> `)
+	tst_GAP_Matcher_0 = substring.Regexp(`(?m)gap> `)
-	tsx_TypeScript_Matcher_0 = regexp.MustCompile(`(?m)^\s*(import.+(from\s+|require\()['"]react|\/\/\/\s*<reference\s)`)
+	tsx_TypeScript_Matcher_0 = substring.Regexp(`(?m)^\s*(import.+(from\s+|require\()['"]react|\/\/\/\s*<reference\s)`)
-	tsx_XML_Matcher_0 = regexp.MustCompile(`(?mi)^\s*<\?xml\s+version`)
+	tsx_XML_Matcher_0 = substring.Regexp(`(?mi)^\s*<\?xml\s+version`)
 )
@@ -4,9 +4,7 @@ package data
 // THIS FILE SHOULD NOT BE EDITED BY HAND
 // Extracted from github/linguist commit: {{ getCommit }}

-import (
-	"regexp"
-)
+import "gopkg.in/toqueteos/substring.v1"

 type languageMatcher func ([]byte) []string

@@ -18,7 +16,7 @@ var ContentMatchers = map[string]languageMatcher{
 	{{- if not (avoidLanguage $language) }}
 	{{- if gt (len $language.Heuristics) 0 }}
 	{{- if gt $i 0 }} else {{ end -}}
-	if {{- range $j, $heuristic := $language.Heuristics }} {{ $heuristic.Name }}.Match(i)
+	if {{- range $j, $heuristic := $language.Heuristics }} {{ $heuristic.Name }}.Match(string(i))
 	{{- if lt $j (len $language.LogicRelations) }} {{index $language.LogicRelations $j}} {{- end -}} {{ end }} {
 		return []string{ {{- printf "%q" $language.Language -}} }
 	}
@@ -34,6 +32,6 @@ var ContentMatchers = map[string]languageMatcher{

 var (
 {{ range $index, $heuristic := getAllHeuristics . -}}
-	{{ $heuristic.Name }} = regexp.MustCompile(`{{ $heuristic.Regexp }}`)
+	{{ $heuristic.Name }} = substring.Regexp(`{{ $heuristic.Regexp }}`)
 {{ end -}}
 )
@@ -14,7 +14,7 @@ import (

 const (
 	lingustURL = "https://github.com/github/linguist.git"
-	commit = "b6460f8ed6b249281ada099ca28bd8f1230b8892"
+	commit = "d5c8db3fb91963c4b2762ca2ea2ff7cfac109f68"
 	samplesDir = "samples"
 	languagesFile = "lib/linguist/languages.yml"

@@ -2,7 +2,7 @@ package data

 // CODE GENERATED AUTOMATICALLY WITH gopkg.in/src-d/enry.v1/internal/code-generator
 // THIS FILE SHOULD NOT BE EDITED BY HAND
-// Extracted from github/linguist commit: b6460f8ed6b249281ada099ca28bd8f1230b8892
+// Extracted from github/linguist commit: d5c8db3fb91963c4b2762ca2ea2ff7cfac109f68

 // LanguagesByAlias keeps alias for different languages and use the name of the languages as an alias too.
 // All the keys (alias or not) are written in lower case and the whitespaces has been replaced by underscores.
@@ -163,6 +163,7 @@ var LanguagesByAlias = map[string]string{
 	"dylan": "Dylan",
 	"e": "E",
 	"eagle": "Eagle",
+	"easybuild": "Easybuild",
 	"ebnf": "EBNF",
 	"ec": "eC",
 	"ecere_projects": "Ecere Projects",
@@ -438,7 +439,7 @@ var LanguagesByAlias = map[string]string{
 	"pawn": "PAWN",
 	"pep8": "Pep8",
 	"perl": "Perl",
-	"perl6": "Perl6",
+	"perl_6": "Perl 6",
 	"php": "PHP",
 	"pic": "Pic",
 	"pickle": "Pickle",
@@ -508,6 +509,7 @@ var LanguagesByAlias = map[string]string{
 	"restructuredtext": "reStructuredText",
 	"rexx": "REXX",
 	"rhtml": "RHTML",
+	"ring": "Ring",
 	"rmarkdown": "RMarkdown",
 	"robotframework": "RobotFramework",
 	"roff": "Roff",
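The alias hunks above follow the key convention stated in the file's own comment: keys are lower case with whitespace replaced by underscores, which is why this commit renames the `perl6` entry to `perl_6` for "Perl 6". A small sketch of that normalization (the `normalizeAlias` helper is hypothetical, not enry's API):

```go
package main

import (
	"fmt"
	"strings"
)

// normalizeAlias illustrates the LanguagesByAlias key convention:
// lower-case the name and replace spaces with underscores.
func normalizeAlias(name string) string {
	return strings.Replace(strings.ToLower(name), " ", "_", -1)
}

func main() {
	fmt.Println(normalizeAlias("Perl 6")) // perl_6
	fmt.Println(normalizeAlias("World of Warcraft Addon Data"))
}
```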
@@ -2,7 +2,7 @@ package data

 // CODE GENERATED AUTOMATICALLY WITH gopkg.in/src-d/enry.v1/internal/code-generator
 // THIS FILE SHOULD NOT BE EDITED BY HAND
-// Extracted from github/linguist commit: b6460f8ed6b249281ada099ca28bd8f1230b8892
+// Extracted from github/linguist commit: d5c8db3fb91963c4b2762ca2ea2ff7cfac109f68

 // linguist's commit from which files were generated.
-var LinguistCommit = "b6460f8ed6b249281ada099ca28bd8f1230b8892"
+var LinguistCommit = "d5c8db3fb91963c4b2762ca2ea2ff7cfac109f68"
@@ -2,443 +2,441 @@ package data

 // CODE GENERATED AUTOMATICALLY WITH gopkg.in/src-d/enry.v1/internal/code-generator
 // THIS FILE SHOULD NOT BE EDITED BY HAND
-// Extracted from github/linguist commit: b6460f8ed6b249281ada099ca28bd8f1230b8892
+// Extracted from github/linguist commit: d5c8db3fb91963c4b2762ca2ea2ff7cfac109f68

-import (
-	"regexp"
-)
+import "gopkg.in/toqueteos/substring.v1"

 type languageMatcher func([]byte) []string

 var ContentMatchers = map[string]languageMatcher{
 	".asc": func(i []byte) []string {
-		if asc_PublicKey_Matcher_0.Match(i) {
+		if asc_PublicKey_Matcher_0.Match(string(i)) {
 			return []string{"Public Key"}
-		} else if asc_AsciiDoc_Matcher_0.Match(i) {
+		} else if asc_AsciiDoc_Matcher_0.Match(string(i)) {
 			return []string{"AsciiDoc"}
-		} else if asc_AGSScript_Matcher_0.Match(i) {
+		} else if asc_AGSScript_Matcher_0.Match(string(i)) {
 			return []string{"AGS Script"}
 		}

 		return nil
 	},
 	".bb": func(i []byte) []string {
-		if bb_BlitzBasic_Matcher_0.Match(i) || bb_BlitzBasic_Matcher_1.Match(i) {
+		if bb_BlitzBasic_Matcher_0.Match(string(i)) || bb_BlitzBasic_Matcher_1.Match(string(i)) {
 			return []string{"BlitzBasic"}
-		} else if bb_BitBake_Matcher_0.Match(i) {
+		} else if bb_BitBake_Matcher_0.Match(string(i)) {
 			return []string{"BitBake"}
 		}

 		return nil
 	},
 	".builds": func(i []byte) []string {
-		if builds_XML_Matcher_0.Match(i) {
+		if builds_XML_Matcher_0.Match(string(i)) {
 			return []string{"XML"}
 		}

 		return []string{"Text"}
 	},
 	".ch": func(i []byte) []string {
-		if ch_xBase_Matcher_0.Match(i) {
+		if ch_xBase_Matcher_0.Match(string(i)) {
 			return []string{"xBase"}
 		}

 		return nil
 	},
 	".cl": func(i []byte) []string {
-		if cl_CommonLisp_Matcher_0.Match(i) {
+		if cl_CommonLisp_Matcher_0.Match(string(i)) {
 			return []string{"Common Lisp"}
-		} else if cl_Cool_Matcher_0.Match(i) {
+		} else if cl_Cool_Matcher_0.Match(string(i)) {
 			return []string{"Cool"}
-		} else if cl_OpenCL_Matcher_0.Match(i) {
+		} else if cl_OpenCL_Matcher_0.Match(string(i)) {
 			return []string{"OpenCL"}
 		}

 		return nil
 	},
 	".cls": func(i []byte) []string {
-		if cls_TeX_Matcher_0.Match(i) {
+		if cls_TeX_Matcher_0.Match(string(i)) {
 			return []string{"TeX"}
 		}

 		return nil
 	},
 	".cs": func(i []byte) []string {
-		if cs_Smalltalk_Matcher_0.Match(i) {
+		if cs_Smalltalk_Matcher_0.Match(string(i)) {
 			return []string{"Smalltalk"}
-		} else if cs_CSharp_Matcher_0.Match(i) || cs_CSharp_Matcher_1.Match(i) {
+		} else if cs_CSharp_Matcher_0.Match(string(i)) || cs_CSharp_Matcher_1.Match(string(i)) {
 			return []string{"C#"}
 		}

 		return nil
 	},
 	".d": func(i []byte) []string {
-		if d_D_Matcher_0.Match(i) {
+		if d_D_Matcher_0.Match(string(i)) {
 			return []string{"D"}
-		} else if d_DTrace_Matcher_0.Match(i) {
+		} else if d_DTrace_Matcher_0.Match(string(i)) {
 			return []string{"DTrace"}
-		} else if d_Makefile_Matcher_0.Match(i) {
+		} else if d_Makefile_Matcher_0.Match(string(i)) {
 			return []string{"Makefile"}
 		}

 		return nil
 	},
 	".ecl": func(i []byte) []string {
-		if ecl_ECLiPSe_Matcher_0.Match(i) {
+		if ecl_ECLiPSe_Matcher_0.Match(string(i)) {
 			return []string{"ECLiPSe"}
-		} else if ecl_ECL_Matcher_0.Match(i) {
+		} else if ecl_ECL_Matcher_0.Match(string(i)) {
 			return []string{"ECL"}
 		}

 		return nil
 	},
 	".es": func(i []byte) []string {
-		if es_Erlang_Matcher_0.Match(i) {
+		if es_Erlang_Matcher_0.Match(string(i)) {
 			return []string{"Erlang"}
 		}

 		return nil
 	},
 	".f": func(i []byte) []string {
-		if f_Forth_Matcher_0.Match(i) {
+		if f_Forth_Matcher_0.Match(string(i)) {
 			return []string{"Forth"}
-		} else if f_FilebenchWML_Matcher_0.Match(i) {
+		} else if f_FilebenchWML_Matcher_0.Match(string(i)) {
 			return []string{"Filebench WML"}
-		} else if f_Fortran_Matcher_0.Match(i) {
+		} else if f_Fortran_Matcher_0.Match(string(i)) {
 			return []string{"Fortran"}
 		}

 		return nil
 	},
 	".for": func(i []byte) []string {
-		if for_Forth_Matcher_0.Match(i) {
+		if for_Forth_Matcher_0.Match(string(i)) {
 			return []string{"Forth"}
-		} else if for_Fortran_Matcher_0.Match(i) {
+		} else if for_Fortran_Matcher_0.Match(string(i)) {
 			return []string{"Fortran"}
 		}

 		return nil
 	},
 	".fr": func(i []byte) []string {
-		if fr_Forth_Matcher_0.Match(i) {
+		if fr_Forth_Matcher_0.Match(string(i)) {
 			return []string{"Forth"}
-		} else if fr_Frege_Matcher_0.Match(i) {
+		} else if fr_Frege_Matcher_0.Match(string(i)) {
 			return []string{"Frege"}
 		}

 		return []string{"Text"}
 	},
 	".fs": func(i []byte) []string {
-		if fs_Forth_Matcher_0.Match(i) {
+		if fs_Forth_Matcher_0.Match(string(i)) {
 			return []string{"Forth"}
-		} else if fs_FSharp_Matcher_0.Match(i) {
+		} else if fs_FSharp_Matcher_0.Match(string(i)) {
 			return []string{"F#"}
-		} else if fs_GLSL_Matcher_0.Match(i) {
+		} else if fs_GLSL_Matcher_0.Match(string(i)) {
 			return []string{"GLSL"}
-		} else if fs_Filterscript_Matcher_0.Match(i) {
+		} else if fs_Filterscript_Matcher_0.Match(string(i)) {
 			return []string{"Filterscript"}
 		}

 		return nil
 	},
 	".gs": func(i []byte) []string {
-		if gs_Gosu_Matcher_0.Match(i) {
+		if gs_Gosu_Matcher_0.Match(string(i)) {
 			return []string{"Gosu"}
 		}

 		return nil
 	},
 	".h": func(i []byte) []string {
-		if h_ObjectiveDashC_Matcher_0.Match(i) {
+		if h_ObjectiveDashC_Matcher_0.Match(string(i)) {
 			return []string{"Objective-C"}
-		} else if h_CPlusPlus_Matcher_0.Match(i) || h_CPlusPlus_Matcher_1.Match(i) || h_CPlusPlus_Matcher_2.Match(i) || h_CPlusPlus_Matcher_3.Match(i) || h_CPlusPlus_Matcher_4.Match(i) || h_CPlusPlus_Matcher_5.Match(i) || h_CPlusPlus_Matcher_6.Match(i) {
|
} else if h_CPlusPlus_Matcher_0.Match(string(i)) || h_CPlusPlus_Matcher_1.Match(string(i)) || h_CPlusPlus_Matcher_2.Match(string(i)) || h_CPlusPlus_Matcher_3.Match(string(i)) || h_CPlusPlus_Matcher_4.Match(string(i)) || h_CPlusPlus_Matcher_5.Match(string(i)) || h_CPlusPlus_Matcher_6.Match(string(i)) {
|
||||||
return []string{"C++"}
|
return []string{"C++"}
|
||||||
}
|
}
|
||||||
|
|
||||||
return nil
|
return nil
|
||||||
},
|
},
|
||||||
".inc": func(i []byte) []string {
|
".inc": func(i []byte) []string {
|
||||||
if inc_PHP_Matcher_0.Match(i) {
|
if inc_PHP_Matcher_0.Match(string(i)) {
|
||||||
return []string{"PHP"}
|
return []string{"PHP"}
|
||||||
} else if inc_POVDashRaySDL_Matcher_0.Match(i) {
|
} else if inc_POVDashRaySDL_Matcher_0.Match(string(i)) {
|
||||||
return []string{"POV-Ray SDL"}
|
return []string{"POV-Ray SDL"}
|
||||||
}
|
}
|
||||||
|
|
||||||
return nil
|
return nil
|
||||||
},
|
},
|
||||||
".l": func(i []byte) []string {
|
".l": func(i []byte) []string {
|
||||||
if l_CommonLisp_Matcher_0.Match(i) {
|
if l_CommonLisp_Matcher_0.Match(string(i)) {
|
||||||
return []string{"Common Lisp"}
|
return []string{"Common Lisp"}
|
||||||
} else if l_Lex_Matcher_0.Match(i) {
|
} else if l_Lex_Matcher_0.Match(string(i)) {
|
||||||
return []string{"Lex"}
|
return []string{"Lex"}
|
||||||
} else if l_Roff_Matcher_0.Match(i) {
|
} else if l_Roff_Matcher_0.Match(string(i)) {
|
||||||
return []string{"Roff"}
|
return []string{"Roff"}
|
||||||
} else if l_PicoLisp_Matcher_0.Match(i) {
|
} else if l_PicoLisp_Matcher_0.Match(string(i)) {
|
||||||
return []string{"PicoLisp"}
|
return []string{"PicoLisp"}
|
||||||
}
|
}
|
||||||
|
|
||||||
return nil
|
return nil
|
||||||
},
|
},
|
||||||
".ls": func(i []byte) []string {
|
".ls": func(i []byte) []string {
|
||||||
if ls_LoomScript_Matcher_0.Match(i) {
|
if ls_LoomScript_Matcher_0.Match(string(i)) {
|
||||||
return []string{"LoomScript"}
|
return []string{"LoomScript"}
|
||||||
}
|
}
|
||||||
|
|
||||||
return []string{"LiveScript"}
|
return []string{"LiveScript"}
|
||||||
},
|
},
|
||||||
".lsp": func(i []byte) []string {
|
".lsp": func(i []byte) []string {
|
||||||
if lsp_CommonLisp_Matcher_0.Match(i) {
|
if lsp_CommonLisp_Matcher_0.Match(string(i)) {
|
||||||
return []string{"Common Lisp"}
|
return []string{"Common Lisp"}
|
||||||
} else if lsp_NewLisp_Matcher_0.Match(i) {
|
} else if lsp_NewLisp_Matcher_0.Match(string(i)) {
|
||||||
return []string{"NewLisp"}
|
return []string{"NewLisp"}
|
||||||
}
|
}
|
||||||
|
|
||||||
return nil
|
return nil
|
||||||
},
|
},
|
||||||
".lisp": func(i []byte) []string {
|
".lisp": func(i []byte) []string {
|
||||||
if lisp_CommonLisp_Matcher_0.Match(i) {
|
if lisp_CommonLisp_Matcher_0.Match(string(i)) {
|
||||||
return []string{"Common Lisp"}
|
return []string{"Common Lisp"}
|
||||||
} else if lisp_NewLisp_Matcher_0.Match(i) {
|
} else if lisp_NewLisp_Matcher_0.Match(string(i)) {
|
||||||
return []string{"NewLisp"}
|
return []string{"NewLisp"}
|
||||||
}
|
}
|
||||||
|
|
||||||
return nil
|
return nil
|
||||||
},
|
},
|
||||||
".m": func(i []byte) []string {
|
".m": func(i []byte) []string {
|
||||||
if m_ObjectiveDashC_Matcher_0.Match(i) {
|
if m_ObjectiveDashC_Matcher_0.Match(string(i)) {
|
||||||
return []string{"Objective-C"}
|
return []string{"Objective-C"}
|
||||||
} else if m_Mercury_Matcher_0.Match(i) {
|
} else if m_Mercury_Matcher_0.Match(string(i)) {
|
||||||
return []string{"Mercury"}
|
return []string{"Mercury"}
|
||||||
} else if m_MUF_Matcher_0.Match(i) {
|
} else if m_MUF_Matcher_0.Match(string(i)) {
|
||||||
return []string{"MUF"}
|
return []string{"MUF"}
|
||||||
} else if m_M_Matcher_0.Match(i) {
|
} else if m_M_Matcher_0.Match(string(i)) {
|
||||||
return []string{"M"}
|
return []string{"M"}
|
||||||
} else if m_Mathematica_Matcher_0.Match(i) {
|
} else if m_Mathematica_Matcher_0.Match(string(i)) {
|
||||||
return []string{"Mathematica"}
|
return []string{"Mathematica"}
|
||||||
} else if m_Matlab_Matcher_0.Match(i) {
|
} else if m_Matlab_Matcher_0.Match(string(i)) {
|
||||||
return []string{"Matlab"}
|
return []string{"Matlab"}
|
||||||
} else if m_Limbo_Matcher_0.Match(i) {
|
} else if m_Limbo_Matcher_0.Match(string(i)) {
|
||||||
return []string{"Limbo"}
|
return []string{"Limbo"}
|
||||||
}
|
}
|
||||||
|
|
||||||
return nil
|
return nil
|
||||||
},
|
},
|
||||||
".md": func(i []byte) []string {
|
".md": func(i []byte) []string {
|
||||||
if md_Markdown_Matcher_0.Match(i) || md_Markdown_Matcher_1.Match(i) {
|
if md_Markdown_Matcher_0.Match(string(i)) || md_Markdown_Matcher_1.Match(string(i)) {
|
||||||
return []string{"Markdown"}
|
return []string{"Markdown"}
|
||||||
} else if md_GCCMachineDescription_Matcher_0.Match(i) {
|
} else if md_GCCMachineDescription_Matcher_0.Match(string(i)) {
|
||||||
return []string{"GCC Machine Description"}
|
return []string{"GCC Machine Description"}
|
||||||
}
|
}
|
||||||
|
|
||||||
return []string{"Markdown"}
|
return []string{"Markdown"}
|
||||||
},
|
},
|
||||||
".ml": func(i []byte) []string {
|
".ml": func(i []byte) []string {
|
||||||
if ml_OCaml_Matcher_0.Match(i) {
|
if ml_OCaml_Matcher_0.Match(string(i)) {
|
||||||
return []string{"OCaml"}
|
return []string{"OCaml"}
|
||||||
} else if ml_StandardML_Matcher_0.Match(i) {
|
} else if ml_StandardML_Matcher_0.Match(string(i)) {
|
||||||
return []string{"Standard ML"}
|
return []string{"Standard ML"}
|
||||||
}
|
}
|
||||||
|
|
||||||
return nil
|
return nil
|
||||||
},
|
},
|
||||||
".mod": func(i []byte) []string {
|
".mod": func(i []byte) []string {
|
||||||
if mod_XML_Matcher_0.Match(i) {
|
if mod_XML_Matcher_0.Match(string(i)) {
|
||||||
return []string{"XML"}
|
return []string{"XML"}
|
||||||
} else if mod_ModulaDash2_Matcher_0.Match(i) || mod_ModulaDash2_Matcher_1.Match(i) {
|
} else if mod_ModulaDash2_Matcher_0.Match(string(i)) || mod_ModulaDash2_Matcher_1.Match(string(i)) {
|
||||||
return []string{"Modula-2"}
|
return []string{"Modula-2"}
|
||||||
}
|
}
|
||||||
|
|
||||||
return []string{"Linux Kernel Module", "AMPL"}
|
return []string{"Linux Kernel Module", "AMPL"}
|
||||||
},
|
},
|
||||||
".ms": func(i []byte) []string {
|
".ms": func(i []byte) []string {
|
||||||
if ms_Roff_Matcher_0.Match(i) {
|
if ms_Roff_Matcher_0.Match(string(i)) {
|
||||||
return []string{"Roff"}
|
return []string{"Roff"}
|
||||||
}
|
}
|
||||||
|
|
||||||
return []string{"MAXScript"}
|
return []string{"MAXScript"}
|
||||||
},
|
},
|
||||||
".n": func(i []byte) []string {
|
".n": func(i []byte) []string {
|
||||||
if n_Roff_Matcher_0.Match(i) {
|
if n_Roff_Matcher_0.Match(string(i)) {
|
||||||
return []string{"Roff"}
|
return []string{"Roff"}
|
||||||
} else if n_Nemerle_Matcher_0.Match(i) {
|
} else if n_Nemerle_Matcher_0.Match(string(i)) {
|
||||||
return []string{"Nemerle"}
|
return []string{"Nemerle"}
|
||||||
}
|
}
|
||||||
|
|
||||||
return nil
|
return nil
|
||||||
},
|
},
|
||||||
".ncl": func(i []byte) []string {
|
".ncl": func(i []byte) []string {
|
||||||
if ncl_Text_Matcher_0.Match(i) {
|
if ncl_Text_Matcher_0.Match(string(i)) {
|
||||||
return []string{"Text"}
|
return []string{"Text"}
|
||||||
}
|
}
|
||||||
|
|
||||||
return nil
|
return nil
|
||||||
},
|
},
|
||||||
".nl": func(i []byte) []string {
|
".nl": func(i []byte) []string {
|
||||||
if nl_NL_Matcher_0.Match(i) {
|
if nl_NL_Matcher_0.Match(string(i)) {
|
||||||
return []string{"NL"}
|
return []string{"NL"}
|
||||||
}
|
}
|
||||||
|
|
||||||
return []string{"NewLisp"}
|
return []string{"NewLisp"}
|
||||||
},
|
},
|
||||||
".php": func(i []byte) []string {
|
".php": func(i []byte) []string {
|
||||||
if php_Hack_Matcher_0.Match(i) {
|
if php_Hack_Matcher_0.Match(string(i)) {
|
||||||
return []string{"Hack"}
|
return []string{"Hack"}
|
||||||
} else if php_PHP_Matcher_0.Match(i) {
|
} else if php_PHP_Matcher_0.Match(string(i)) {
|
||||||
return []string{"PHP"}
|
return []string{"PHP"}
|
||||||
}
|
}
|
||||||
|
|
||||||
return nil
|
return nil
|
||||||
},
|
},
|
||||||
".pl": func(i []byte) []string {
|
".pl": func(i []byte) []string {
|
||||||
if pl_Prolog_Matcher_0.Match(i) {
|
if pl_Prolog_Matcher_0.Match(string(i)) {
|
||||||
return []string{"Prolog"}
|
return []string{"Prolog"}
|
||||||
} else if pl_Perl_Matcher_0.Match(i) {
|
} else if pl_Perl_Matcher_0.Match(string(i)) {
|
||||||
return []string{"Perl"}
|
return []string{"Perl"}
|
||||||
} else if pl_Perl6_Matcher_0.Match(i) {
|
} else if pl_Perl6_Matcher_0.Match(string(i)) {
|
||||||
return []string{"Perl 6"}
|
return []string{"Perl 6"}
|
||||||
}
|
}
|
||||||
|
|
||||||
return nil
|
return nil
|
||||||
},
|
},
|
||||||
".pm": func(i []byte) []string {
|
".pm": func(i []byte) []string {
|
||||||
if pm_Perl6_Matcher_0.Match(i) {
|
if pm_Perl6_Matcher_0.Match(string(i)) {
|
||||||
return []string{"Perl 6"}
|
return []string{"Perl 6"}
|
||||||
} else if pm_Perl_Matcher_0.Match(i) {
|
} else if pm_Perl_Matcher_0.Match(string(i)) {
|
||||||
return []string{"Perl"}
|
return []string{"Perl"}
|
||||||
}
|
}
|
||||||
|
|
||||||
return nil
|
return nil
|
||||||
},
|
},
|
||||||
".pod": func(i []byte) []string {
|
".pod": func(i []byte) []string {
|
||||||
if pod_Pod_Matcher_0.Match(i) {
|
if pod_Pod_Matcher_0.Match(string(i)) {
|
||||||
return []string{"Pod"}
|
return []string{"Pod"}
|
||||||
}
|
}
|
||||||
|
|
||||||
return []string{"Perl"}
|
return []string{"Perl"}
|
||||||
},
|
},
|
||||||
".pro": func(i []byte) []string {
|
".pro": func(i []byte) []string {
|
||||||
if pro_Prolog_Matcher_0.Match(i) {
|
if pro_Prolog_Matcher_0.Match(string(i)) {
|
||||||
return []string{"Prolog"}
|
return []string{"Prolog"}
|
||||||
} else if pro_INI_Matcher_0.Match(i) {
|
} else if pro_INI_Matcher_0.Match(string(i)) {
|
||||||
return []string{"INI"}
|
return []string{"INI"}
|
||||||
} else if pro_QMake_Matcher_0.Match(i) && pro_QMake_Matcher_1.Match(i) {
|
} else if pro_QMake_Matcher_0.Match(string(i)) && pro_QMake_Matcher_1.Match(string(i)) {
|
||||||
return []string{"QMake"}
|
return []string{"QMake"}
|
||||||
} else if pro_IDL_Matcher_0.Match(i) {
|
} else if pro_IDL_Matcher_0.Match(string(i)) {
|
||||||
return []string{"IDL"}
|
return []string{"IDL"}
|
||||||
}
|
}
|
||||||
|
|
||||||
return nil
|
return nil
|
||||||
},
|
},
|
||||||
".props": func(i []byte) []string {
|
".props": func(i []byte) []string {
|
||||||
if props_XML_Matcher_0.Match(i) {
|
if props_XML_Matcher_0.Match(string(i)) {
|
||||||
return []string{"XML"}
|
return []string{"XML"}
|
||||||
} else if props_INI_Matcher_0.Match(i) {
|
} else if props_INI_Matcher_0.Match(string(i)) {
|
||||||
return []string{"INI"}
|
return []string{"INI"}
|
||||||
}
|
}
|
||||||
|
|
||||||
return nil
|
return nil
|
||||||
},
|
},
|
||||||
".r": func(i []byte) []string {
|
".r": func(i []byte) []string {
|
||||||
if r_Rebol_Matcher_0.Match(i) {
|
if r_Rebol_Matcher_0.Match(string(i)) {
|
||||||
return []string{"Rebol"}
|
return []string{"Rebol"}
|
||||||
} else if r_R_Matcher_0.Match(i) {
|
} else if r_R_Matcher_0.Match(string(i)) {
|
||||||
return []string{"R"}
|
return []string{"R"}
|
||||||
}
|
}
|
||||||
|
|
||||||
return nil
|
return nil
|
||||||
},
|
},
|
||||||
".rno": func(i []byte) []string {
|
".rno": func(i []byte) []string {
|
||||||
if rno_RUNOFF_Matcher_0.Match(i) {
|
if rno_RUNOFF_Matcher_0.Match(string(i)) {
|
||||||
return []string{"RUNOFF"}
|
return []string{"RUNOFF"}
|
||||||
} else if rno_Roff_Matcher_0.Match(i) {
|
} else if rno_Roff_Matcher_0.Match(string(i)) {
|
||||||
return []string{"Roff"}
|
return []string{"Roff"}
|
||||||
}
|
}
|
||||||
|
|
||||||
return nil
|
return nil
|
||||||
},
|
},
|
||||||
".rpy": func(i []byte) []string {
|
".rpy": func(i []byte) []string {
|
||||||
if rpy_Python_Matcher_0.Match(i) {
|
if rpy_Python_Matcher_0.Match(string(i)) {
|
||||||
return []string{"Python"}
|
return []string{"Python"}
|
||||||
}
|
}
|
||||||
|
|
||||||
return []string{"Ren'Py"}
|
return []string{"Ren'Py"}
|
||||||
},
|
},
|
||||||
".rs": func(i []byte) []string {
|
".rs": func(i []byte) []string {
|
||||||
if rs_Rust_Matcher_0.Match(i) {
|
if rs_Rust_Matcher_0.Match(string(i)) {
|
||||||
return []string{"Rust"}
|
return []string{"Rust"}
|
||||||
} else if rs_RenderScript_Matcher_0.Match(i) {
|
} else if rs_RenderScript_Matcher_0.Match(string(i)) {
|
||||||
return []string{"RenderScript"}
|
return []string{"RenderScript"}
|
||||||
}
|
}
|
||||||
|
|
||||||
return nil
|
return nil
|
||||||
},
|
},
|
||||||
".sc": func(i []byte) []string {
|
".sc": func(i []byte) []string {
|
||||||
if sc_SuperCollider_Matcher_0.Match(i) || sc_SuperCollider_Matcher_1.Match(i) || sc_SuperCollider_Matcher_2.Match(i) {
|
if sc_SuperCollider_Matcher_0.Match(string(i)) || sc_SuperCollider_Matcher_1.Match(string(i)) || sc_SuperCollider_Matcher_2.Match(string(i)) {
|
||||||
return []string{"SuperCollider"}
|
return []string{"SuperCollider"}
|
||||||
} else if sc_Scala_Matcher_0.Match(i) || sc_Scala_Matcher_1.Match(i) || sc_Scala_Matcher_2.Match(i) {
|
} else if sc_Scala_Matcher_0.Match(string(i)) || sc_Scala_Matcher_1.Match(string(i)) || sc_Scala_Matcher_2.Match(string(i)) {
|
||||||
return []string{"Scala"}
|
return []string{"Scala"}
|
||||||
}
|
}
|
||||||
|
|
||||||
return nil
|
return nil
|
||||||
},
|
},
|
||||||
".sql": func(i []byte) []string {
|
".sql": func(i []byte) []string {
|
||||||
if sql_PLpgSQL_Matcher_0.Match(i) || sql_PLpgSQL_Matcher_1.Match(i) || sql_PLpgSQL_Matcher_2.Match(i) {
|
if sql_PLpgSQL_Matcher_0.Match(string(i)) || sql_PLpgSQL_Matcher_1.Match(string(i)) || sql_PLpgSQL_Matcher_2.Match(string(i)) {
|
||||||
return []string{"PLpgSQL"}
|
return []string{"PLpgSQL"}
|
||||||
} else if sql_SQLPL_Matcher_0.Match(i) || sql_SQLPL_Matcher_1.Match(i) {
|
} else if sql_SQLPL_Matcher_0.Match(string(i)) || sql_SQLPL_Matcher_1.Match(string(i)) {
|
||||||
return []string{"SQLPL"}
|
return []string{"SQLPL"}
|
||||||
} else if sql_PLSQL_Matcher_0.Match(i) || sql_PLSQL_Matcher_1.Match(i) {
|
} else if sql_PLSQL_Matcher_0.Match(string(i)) || sql_PLSQL_Matcher_1.Match(string(i)) {
|
||||||
return []string{"PLSQL"}
|
return []string{"PLSQL"}
|
||||||
} else if sql_SQL_Matcher_0.Match(i) {
|
} else if sql_SQL_Matcher_0.Match(string(i)) {
|
||||||
return []string{"SQL"}
|
return []string{"SQL"}
|
||||||
}
|
}
|
||||||
|
|
||||||
return nil
|
return nil
|
||||||
},
|
},
|
||||||
".srt": func(i []byte) []string {
|
".srt": func(i []byte) []string {
|
||||||
if srt_SubRipText_Matcher_0.Match(i) {
|
if srt_SubRipText_Matcher_0.Match(string(i)) {
|
||||||
return []string{"SubRip Text"}
|
return []string{"SubRip Text"}
|
||||||
}
|
}
|
||||||
|
|
||||||
return nil
|
return nil
|
||||||
},
|
},
|
||||||
".t": func(i []byte) []string {
|
".t": func(i []byte) []string {
|
||||||
if t_Turing_Matcher_0.Match(i) {
|
if t_Turing_Matcher_0.Match(string(i)) {
|
||||||
return []string{"Turing"}
|
return []string{"Turing"}
|
||||||
} else if t_Perl6_Matcher_0.Match(i) {
|
} else if t_Perl6_Matcher_0.Match(string(i)) {
|
||||||
return []string{"Perl 6"}
|
return []string{"Perl 6"}
|
||||||
} else if t_Perl_Matcher_0.Match(i) {
|
} else if t_Perl_Matcher_0.Match(string(i)) {
|
||||||
return []string{"Perl"}
|
return []string{"Perl"}
|
||||||
}
|
}
|
||||||
|
|
||||||
return nil
|
return nil
|
||||||
},
|
},
|
||||||
".toc": func(i []byte) []string {
|
".toc": func(i []byte) []string {
|
||||||
if toc_WorldofWarcraftAddonData_Matcher_0.Match(i) {
|
if toc_WorldofWarcraftAddonData_Matcher_0.Match(string(i)) {
|
||||||
return []string{"World of Warcraft Addon Data"}
|
return []string{"World of Warcraft Addon Data"}
|
||||||
} else if toc_TeX_Matcher_0.Match(i) {
|
} else if toc_TeX_Matcher_0.Match(string(i)) {
|
||||||
return []string{"TeX"}
|
return []string{"TeX"}
|
||||||
}
|
}
|
||||||
|
|
||||||
return nil
|
return nil
|
||||||
},
|
},
|
||||||
".ts": func(i []byte) []string {
|
".ts": func(i []byte) []string {
|
||||||
if ts_XML_Matcher_0.Match(i) {
|
if ts_XML_Matcher_0.Match(string(i)) {
|
||||||
return []string{"XML"}
|
return []string{"XML"}
|
||||||
}
|
}
|
||||||
|
|
||||||
return []string{"TypeScript"}
|
return []string{"TypeScript"}
|
||||||
},
|
},
|
||||||
".tst": func(i []byte) []string {
|
".tst": func(i []byte) []string {
|
||||||
if tst_GAP_Matcher_0.Match(i) {
|
if tst_GAP_Matcher_0.Match(string(i)) {
|
||||||
return []string{"GAP"}
|
return []string{"GAP"}
|
||||||
}
|
}
|
||||||
|
|
||||||
return []string{"Scilab"}
|
return []string{"Scilab"}
|
||||||
},
|
},
|
||||||
".tsx": func(i []byte) []string {
|
".tsx": func(i []byte) []string {
|
||||||
if tsx_TypeScript_Matcher_0.Match(i) {
|
if tsx_TypeScript_Matcher_0.Match(string(i)) {
|
||||||
return []string{"TypeScript"}
|
return []string{"TypeScript"}
|
||||||
} else if tsx_XML_Matcher_0.Match(i) {
|
} else if tsx_XML_Matcher_0.Match(string(i)) {
|
||||||
return []string{"XML"}
|
return []string{"XML"}
|
||||||
}
|
}
|
||||||
|
|
||||||
@ -447,122 +445,122 @@ var ContentMatchers = map[string]languageMatcher{
|
|||||||
}
|
}
|
||||||
|
|
||||||
var (
|
var (
|
||||||
asc_PublicKey_Matcher_0 = regexp.MustCompile(`(?m)^(----[- ]BEGIN|ssh-(rsa|dss)) `)
|
asc_PublicKey_Matcher_0 = substring.Regexp(`(?m)^(----[- ]BEGIN|ssh-(rsa|dss)) `)
|
||||||
asc_AsciiDoc_Matcher_0 = regexp.MustCompile(`(?m)^[=-]+(\s|\n)|{{[A-Za-z]`)
|
asc_AsciiDoc_Matcher_0 = substring.Regexp(`(?m)^[=-]+(\s|\n)|{{[A-Za-z]`)
|
||||||
asc_AGSScript_Matcher_0 = regexp.MustCompile(`(?m)^(\/\/.+|((import|export)\s+)?(function|int|float|char)\s+((room|repeatedly|on|game)_)?([A-Za-z]+[A-Za-z_0-9]+)\s*[;\(])`)
|
asc_AGSScript_Matcher_0 = substring.Regexp(`(?m)^(\/\/.+|((import|export)\s+)?(function|int|float|char)\s+((room|repeatedly|on|game)_)?([A-Za-z]+[A-Za-z_0-9]+)\s*[;\(])`)
|
||||||
bb_BlitzBasic_Matcher_0 = regexp.MustCompile(`(?m)^\s*; `)
|
bb_BlitzBasic_Matcher_0 = substring.Regexp(`(?m)^\s*; `)
|
||||||
bb_BlitzBasic_Matcher_1 = regexp.MustCompile(`(?m)End Function`)
|
bb_BlitzBasic_Matcher_1 = substring.Regexp(`(?m)End Function`)
|
||||||
bb_BitBake_Matcher_0 = regexp.MustCompile(`(?m)^\s*(# |include|require)\b`)
|
bb_BitBake_Matcher_0 = substring.Regexp(`(?m)^\s*(# |include|require)\b`)
|
||||||
builds_XML_Matcher_0 = regexp.MustCompile(`(?mi)^(\s*)(<Project|<Import|<Property|<?xml|xmlns)`)
|
builds_XML_Matcher_0 = substring.Regexp(`(?mi)^(\s*)(<Project|<Import|<Property|<?xml|xmlns)`)
|
||||||
ch_xBase_Matcher_0 = regexp.MustCompile(`(?mi)^\s*#\s*(if|ifdef|ifndef|define|command|xcommand|translate|xtranslate|include|pragma|undef)\b`)
|
ch_xBase_Matcher_0 = substring.Regexp(`(?mi)^\s*#\s*(if|ifdef|ifndef|define|command|xcommand|translate|xtranslate|include|pragma|undef)\b`)
|
||||||
cl_CommonLisp_Matcher_0 = regexp.MustCompile(`(?mi)^\s*\((defun|in-package|defpackage) `)
|
cl_CommonLisp_Matcher_0 = substring.Regexp(`(?mi)^\s*\((defun|in-package|defpackage) `)
|
||||||
cl_Cool_Matcher_0 = regexp.MustCompile(`(?m)^class`)
|
cl_Cool_Matcher_0 = substring.Regexp(`(?m)^class`)
|
||||||
cl_OpenCL_Matcher_0 = regexp.MustCompile(`(?m)\/\* |\/\/ |^\}`)
|
cl_OpenCL_Matcher_0 = substring.Regexp(`(?m)\/\* |\/\/ |^\}`)
|
||||||
cls_TeX_Matcher_0 = regexp.MustCompile(`(?m)\\\w+{`)
|
cls_TeX_Matcher_0 = substring.Regexp(`(?m)\\\w+{`)
|
||||||
cs_Smalltalk_Matcher_0 = regexp.MustCompile(`(?m)![\w\s]+methodsFor: `)
|
cs_Smalltalk_Matcher_0 = substring.Regexp(`(?m)![\w\s]+methodsFor: `)
|
||||||
cs_CSharp_Matcher_0 = regexp.MustCompile(`(?m)^\s*namespace\s*[\w\.]+\s*{`)
|
cs_CSharp_Matcher_0 = substring.Regexp(`(?m)^\s*namespace\s*[\w\.]+\s*{`)
|
||||||
cs_CSharp_Matcher_1 = regexp.MustCompile(`(?m)^\s*\/\/`)
|
cs_CSharp_Matcher_1 = substring.Regexp(`(?m)^\s*\/\/`)
|
||||||
d_D_Matcher_0 = regexp.MustCompile(`(?m)^module\s+[\w.]*\s*;|import\s+[\w\s,.:]*;|\w+\s+\w+\s*\(.*\)(?:\(.*\))?\s*{[^}]*}|unittest\s*(?:\(.*\))?\s*{[^}]*}`)
|
d_D_Matcher_0 = substring.Regexp(`(?m)^module\s+[\w.]*\s*;|import\s+[\w\s,.:]*;|\w+\s+\w+\s*\(.*\)(?:\(.*\))?\s*{[^}]*}|unittest\s*(?:\(.*\))?\s*{[^}]*}`)
|
||||||
d_DTrace_Matcher_0 = regexp.MustCompile(`(?m)^(\w+:\w*:\w*:\w*|BEGIN|END|provider\s+|(tick|profile)-\w+\s+{[^}]*}|#pragma\s+D\s+(option|attributes|depends_on)\s|#pragma\s+ident\s)`)
|
d_DTrace_Matcher_0 = substring.Regexp(`(?m)^(\w+:\w*:\w*:\w*|BEGIN|END|provider\s+|(tick|profile)-\w+\s+{[^}]*}|#pragma\s+D\s+(option|attributes|depends_on)\s|#pragma\s+ident\s)`)
|
||||||
d_Makefile_Matcher_0 = regexp.MustCompile(`(?m)([\/\\].*:\s+.*\s\\$|: \\$|^ : |^[\w\s\/\\.]+\w+\.\w+\s*:\s+[\w\s\/\\.]+\w+\.\w+)`)
|
d_Makefile_Matcher_0 = substring.Regexp(`(?m)([\/\\].*:\s+.*\s\\$|: \\$|^ : |^[\w\s\/\\.]+\w+\.\w+\s*:\s+[\w\s\/\\.]+\w+\.\w+)`)
|
||||||
ecl_ECLiPSe_Matcher_0 = regexp.MustCompile(`(?m)^[^#]+:-`)
|
ecl_ECLiPSe_Matcher_0 = substring.Regexp(`(?m)^[^#]+:-`)
|
||||||
ecl_ECL_Matcher_0 = regexp.MustCompile(`(?m):=`)
|
ecl_ECL_Matcher_0 = substring.Regexp(`(?m):=`)
|
||||||
es_Erlang_Matcher_0 = regexp.MustCompile(`(?m)^\s*(?:%%|main\s*\(.*?\)\s*->)`)
|
es_Erlang_Matcher_0 = substring.Regexp(`(?m)^\s*(?:%%|main\s*\(.*?\)\s*->)`)
|
||||||
f_Forth_Matcher_0 = regexp.MustCompile(`(?m)^: `)
|
f_Forth_Matcher_0 = substring.Regexp(`(?m)^: `)
|
||||||
f_FilebenchWML_Matcher_0 = regexp.MustCompile(`(?m)flowop`)
|
f_FilebenchWML_Matcher_0 = substring.Regexp(`(?m)flowop`)
|
||||||
f_Fortran_Matcher_0 = regexp.MustCompile(`(?mi)^([c*][^abd-z]| (subroutine|program|end|data)\s|\s*!)`)
|
f_Fortran_Matcher_0 = substring.Regexp(`(?mi)^([c*][^abd-z]| (subroutine|program|end|data)\s|\s*!)`)
|
||||||
for_Forth_Matcher_0 = regexp.MustCompile(`(?m)^: `)
|
for_Forth_Matcher_0 = substring.Regexp(`(?m)^: `)
|
||||||
for_Fortran_Matcher_0 = regexp.MustCompile(`(?mi)^([c*][^abd-z]| (subroutine|program|end|data)\s|\s*!)`)
|
for_Fortran_Matcher_0 = substring.Regexp(`(?mi)^([c*][^abd-z]| (subroutine|program|end|data)\s|\s*!)`)
|
||||||
fr_Forth_Matcher_0 = regexp.MustCompile(`(?m)^(: |also |new-device|previous )`)
|
fr_Forth_Matcher_0 = substring.Regexp(`(?m)^(: |also |new-device|previous )`)
|
||||||
fr_Frege_Matcher_0 = regexp.MustCompile(`(?m)^\s*(import|module|package|data|type) `)
|
fr_Frege_Matcher_0 = substring.Regexp(`(?m)^\s*(import|module|package|data|type) `)
|
||||||
fs_Forth_Matcher_0 = regexp.MustCompile(`(?m)^(: |new-device)`)
|
fs_Forth_Matcher_0 = substring.Regexp(`(?m)^(: |new-device)`)
|
||||||
fs_FSharp_Matcher_0 = regexp.MustCompile(`(?m)^\s*(#light|import|let|module|namespace|open|type)`)
|
fs_FSharp_Matcher_0 = substring.Regexp(`(?m)^\s*(#light|import|let|module|namespace|open|type)`)
|
||||||
fs_GLSL_Matcher_0 = regexp.MustCompile(`(?m)^\s*(#version|precision|uniform|varying|vec[234])`)
|
fs_GLSL_Matcher_0 = substring.Regexp(`(?m)^\s*(#version|precision|uniform|varying|vec[234])`)
|
||||||
fs_Filterscript_Matcher_0 = regexp.MustCompile(`(?m)#include|#pragma\s+(rs|version)|__attribute__`)
|
fs_Filterscript_Matcher_0 = substring.Regexp(`(?m)#include|#pragma\s+(rs|version)|__attribute__`)
|
||||||
gs_Gosu_Matcher_0 = regexp.MustCompile(`(?m)^uses java\.`)
|
gs_Gosu_Matcher_0 = substring.Regexp(`(?m)^uses java\.`)
|
||||||
h_ObjectiveDashC_Matcher_0 = regexp.MustCompile(`(?m)^\s*(@(interface|class|protocol|property|end|synchronised|selector|implementation)\b|#import\s+.+\.h[">])`)
|
h_ObjectiveDashC_Matcher_0 = substring.Regexp(`(?m)^\s*(@(interface|class|protocol|property|end|synchronised|selector|implementation)\b|#import\s+.+\.h[">])`)
|
||||||
h_CPlusPlus_Matcher_0 = regexp.MustCompile(`(?m)^\s*#\s*include <(cstdint|string|vector|map|list|array|bitset|queue|stack|forward_list|unordered_map|unordered_set|(i|o|io)stream)>`)
|
h_CPlusPlus_Matcher_0 = substring.Regexp(`(?m)^\s*#\s*include <(cstdint|string|vector|map|list|array|bitset|queue|stack|forward_list|unordered_map|unordered_set|(i|o|io)stream)>`)
|
||||||
h_CPlusPlus_Matcher_1 = regexp.MustCompile(`(?m)^\s*template\s*<`)
|
h_CPlusPlus_Matcher_1 = substring.Regexp(`(?m)^\s*template\s*<`)
|
||||||
h_CPlusPlus_Matcher_2 = regexp.MustCompile(`(?m)^[ \t]*try`)
|
h_CPlusPlus_Matcher_2 = substring.Regexp(`(?m)^[ \t]*try`)
|
||||||
h_CPlusPlus_Matcher_3 = regexp.MustCompile(`(?m)^[ \t]*catch\s*\(`)
|
h_CPlusPlus_Matcher_3 = substring.Regexp(`(?m)^[ \t]*catch\s*\(`)
|
||||||
h_CPlusPlus_Matcher_4 = regexp.MustCompile(`(?m)^[ \t]*(class|(using[ \t]+)?namespace)\s+\w+`)
|
h_CPlusPlus_Matcher_4 = substring.Regexp(`(?m)^[ \t]*(class|(using[ \t]+)?namespace)\s+\w+`)
|
||||||
h_CPlusPlus_Matcher_5 = regexp.MustCompile(`(?m)^[ \t]*(private|public|protected):$`)
|
h_CPlusPlus_Matcher_5 = substring.Regexp(`(?m)^[ \t]*(private|public|protected):$`)
|
||||||
h_CPlusPlus_Matcher_6 = regexp.MustCompile(`(?m)std::\w+`)
|
h_CPlusPlus_Matcher_6 = substring.Regexp(`(?m)std::\w+`)
|
||||||
inc_PHP_Matcher_0 = regexp.MustCompile(`(?m)^<\?(?:php)?`)
|
inc_PHP_Matcher_0 = substring.Regexp(`(?m)^<\?(?:php)?`)
|
||||||
inc_POVDashRaySDL_Matcher_0 = regexp.MustCompile(`(?m)^\s*#(declare|local|macro|while)\s`)
|
inc_POVDashRaySDL_Matcher_0 = substring.Regexp(`(?m)^\s*#(declare|local|macro|while)\s`)
|
||||||
l_CommonLisp_Matcher_0 = regexp.MustCompile(`(?m)\(def(un|macro)\s`)
|
l_CommonLisp_Matcher_0 = substring.Regexp(`(?m)\(def(un|macro)\s`)
|
||||||
l_Lex_Matcher_0 = regexp.MustCompile(`(?m)^(%[%{}]xs|<.*>)`)
|
l_Lex_Matcher_0 = substring.Regexp(`(?m)^(%[%{}]xs|<.*>)`)
|
||||||
l_Roff_Matcher_0 = regexp.MustCompile(`(?mi)^\.[a-z][a-z](\s|$)`)
|
l_Roff_Matcher_0 = substring.Regexp(`(?mi)^\.[a-z][a-z](\s|$)`)
|
||||||
l_PicoLisp_Matcher_0 = regexp.MustCompile(`(?m)^\((de|class|rel|code|data|must)\s`)
|
l_PicoLisp_Matcher_0 = substring.Regexp(`(?m)^\((de|class|rel|code|data|must)\s`)
|
||||||
ls_LoomScript_Matcher_0 = regexp.MustCompile(`(?m)^\s*package\s*[\w\.\/\*\s]*\s*{`)
|
ls_LoomScript_Matcher_0 = substring.Regexp(`(?m)^\s*package\s*[\w\.\/\*\s]*\s*{`)
|
||||||
lsp_CommonLisp_Matcher_0 = regexp.MustCompile(`(?mi)^\s*\((defun|in-package|defpackage) `)
|
lsp_CommonLisp_Matcher_0 = substring.Regexp(`(?mi)^\s*\((defun|in-package|defpackage) `)
|
||||||
lsp_NewLisp_Matcher_0 = regexp.MustCompile(`(?m)^\s*\(define `)
|
lsp_NewLisp_Matcher_0 = substring.Regexp(`(?m)^\s*\(define `)
|
||||||
lisp_CommonLisp_Matcher_0 = regexp.MustCompile(`(?mi)^\s*\((defun|in-package|defpackage) `)
|
lisp_CommonLisp_Matcher_0 = substring.Regexp(`(?mi)^\s*\((defun|in-package|defpackage) `)
|
||||||
lisp_NewLisp_Matcher_0 = regexp.MustCompile(`(?m)^\s*\(define `)
|
lisp_NewLisp_Matcher_0 = substring.Regexp(`(?m)^\s*\(define `)
|
||||||
m_ObjectiveDashC_Matcher_0 = regexp.MustCompile(`(?m)^\s*(@(interface|class|protocol|property|end|synchronised|selector|implementation)\b|#import\s+.+\.h[">])`)
|
m_ObjectiveDashC_Matcher_0 = substring.Regexp(`(?m)^\s*(@(interface|class|protocol|property|end|synchronised|selector|implementation)\b|#import\s+.+\.h[">])`)
|
||||||
m_Mercury_Matcher_0 = regexp.MustCompile(`(?m):- module`)
|
m_Mercury_Matcher_0 = substring.Regexp(`(?m):- module`)
|
||||||
m_MUF_Matcher_0 = regexp.MustCompile(`(?m)^: `)
|
m_MUF_Matcher_0 = substring.Regexp(`(?m)^: `)
|
||||||
m_M_Matcher_0 = regexp.MustCompile(`(?m)^\s*;`)
|
m_M_Matcher_0 = substring.Regexp(`(?m)^\s*;`)
|
||||||
m_Mathematica_Matcher_0 = regexp.MustCompile(`(?m)\*\)$`)
|
m_Mathematica_Matcher_0 = substring.Regexp(`(?m)\*\)$`)
|
||||||
m_Matlab_Matcher_0 = regexp.MustCompile(`(?m)^\s*%`)
|
m_Matlab_Matcher_0 = substring.Regexp(`(?m)^\s*%`)
|
||||||
m_Limbo_Matcher_0 = regexp.MustCompile(`(?m)^\w+\s*:\s*module\s*{`)
|
m_Limbo_Matcher_0 = substring.Regexp(`(?m)^\w+\s*:\s*module\s*{`)
|
||||||
md_Markdown_Matcher_0 = regexp.MustCompile(`(?mi)(^[-a-z0-9=#!\*\[|>])|<\/`)
|
md_Markdown_Matcher_0 = substring.Regexp(`(?mi)(^[-a-z0-9=#!\*\[|>])|<\/`)
|
||||||
md_Markdown_Matcher_1 = regexp.MustCompile(`^$`)
|
md_Markdown_Matcher_1 = substring.Regexp(`^$`)
|
||||||
md_GCCMachineDescription_Matcher_0 = regexp.MustCompile(`(?m)^(;;|\(define_)`)
|
md_GCCMachineDescription_Matcher_0 = substring.Regexp(`(?m)^(;;|\(define_)`)
|
||||||
-	ml_OCaml_Matcher_0 = regexp.MustCompile(`(?m)(^\s*module)|let rec |match\s+(\S+\s)+with`)
+	ml_OCaml_Matcher_0 = substring.Regexp(`(?m)(^\s*module)|let rec |match\s+(\S+\s)+with`)
-	ml_StandardML_Matcher_0 = regexp.MustCompile(`(?m)=> |case\s+(\S+\s)+of`)
+	ml_StandardML_Matcher_0 = substring.Regexp(`(?m)=> |case\s+(\S+\s)+of`)
-	mod_XML_Matcher_0 = regexp.MustCompile(`(?m)<!ENTITY `)
+	mod_XML_Matcher_0 = substring.Regexp(`(?m)<!ENTITY `)
-	mod_ModulaDash2_Matcher_0 = regexp.MustCompile(`(?mi)^\s*MODULE [\w\.]+;`)
+	mod_ModulaDash2_Matcher_0 = substring.Regexp(`(?mi)^\s*MODULE [\w\.]+;`)
-	mod_ModulaDash2_Matcher_1 = regexp.MustCompile(`(?mi)^\s*END [\w\.]+;`)
+	mod_ModulaDash2_Matcher_1 = substring.Regexp(`(?mi)^\s*END [\w\.]+;`)
-	ms_Roff_Matcher_0 = regexp.MustCompile(`(?mi)^[.'][a-z][a-z](\s|$)`)
+	ms_Roff_Matcher_0 = substring.Regexp(`(?mi)^[.'][a-z][a-z](\s|$)`)
-	n_Roff_Matcher_0 = regexp.MustCompile(`(?m)^[.']`)
+	n_Roff_Matcher_0 = substring.Regexp(`(?m)^[.']`)
-	n_Nemerle_Matcher_0 = regexp.MustCompile(`(?m)^(module|namespace|using)\s`)
+	n_Nemerle_Matcher_0 = substring.Regexp(`(?m)^(module|namespace|using)\s`)
-	ncl_Text_Matcher_0 = regexp.MustCompile(`(?m)THE_TITLE`)
+	ncl_Text_Matcher_0 = substring.Regexp(`(?m)THE_TITLE`)
-	nl_NL_Matcher_0 = regexp.MustCompile(`(?m)^(b|g)[0-9]+ `)
+	nl_NL_Matcher_0 = substring.Regexp(`(?m)^(b|g)[0-9]+ `)
-	php_Hack_Matcher_0 = regexp.MustCompile(`(?m)<\?hh`)
+	php_Hack_Matcher_0 = substring.Regexp(`(?m)<\?hh`)
-	php_PHP_Matcher_0 = regexp.MustCompile(`(?m)<?[^h]`)
+	php_PHP_Matcher_0 = substring.Regexp(`(?m)<?[^h]`)
-	pl_Prolog_Matcher_0 = regexp.MustCompile(`(?m)^[^#]*:-`)
+	pl_Prolog_Matcher_0 = substring.Regexp(`(?m)^[^#]*:-`)
-	pl_Perl_Matcher_0 = regexp.MustCompile(`(?m)use strict|use\s+v?5\.`)
+	pl_Perl_Matcher_0 = substring.Regexp(`(?m)use strict|use\s+v?5\.`)
-	pl_Perl6_Matcher_0 = regexp.MustCompile(`(?m)^(use v6|(my )?class|module)`)
+	pl_Perl6_Matcher_0 = substring.Regexp(`(?m)^(use v6|(my )?class|module)`)
-	pm_Perl6_Matcher_0 = regexp.MustCompile(`(?m)^\s*(?:use\s+v6\s*;|(?:\bmy\s+)?class|module)\b`)
+	pm_Perl6_Matcher_0 = substring.Regexp(`(?m)^\s*(?:use\s+v6\s*;|(?:\bmy\s+)?class|module)\b`)
-	pm_Perl_Matcher_0 = regexp.MustCompile(`(?m)\buse\s+(?:strict\b|v?5\.)`)
+	pm_Perl_Matcher_0 = substring.Regexp(`(?m)\buse\s+(?:strict\b|v?5\.)`)
-	pod_Pod_Matcher_0 = regexp.MustCompile(`(?m)^=\w+\b`)
+	pod_Pod_Matcher_0 = substring.Regexp(`(?m)^=\w+\b`)
-	pro_Prolog_Matcher_0 = regexp.MustCompile(`(?m)^[^#]+:-`)
+	pro_Prolog_Matcher_0 = substring.Regexp(`(?m)^[^#]+:-`)
-	pro_INI_Matcher_0 = regexp.MustCompile(`(?m)last_client=`)
+	pro_INI_Matcher_0 = substring.Regexp(`(?m)last_client=`)
-	pro_QMake_Matcher_0 = regexp.MustCompile(`(?m)HEADERS`)
+	pro_QMake_Matcher_0 = substring.Regexp(`(?m)HEADERS`)
-	pro_QMake_Matcher_1 = regexp.MustCompile(`(?m)SOURCES`)
+	pro_QMake_Matcher_1 = substring.Regexp(`(?m)SOURCES`)
-	pro_IDL_Matcher_0 = regexp.MustCompile(`(?m)^\s*function[ \w,]+$`)
+	pro_IDL_Matcher_0 = substring.Regexp(`(?m)^\s*function[ \w,]+$`)
-	props_XML_Matcher_0 = regexp.MustCompile(`(?mi)^(\s*)(<Project|<Import|<Property|<?xml|xmlns)`)
+	props_XML_Matcher_0 = substring.Regexp(`(?mi)^(\s*)(<Project|<Import|<Property|<?xml|xmlns)`)
-	props_INI_Matcher_0 = regexp.MustCompile(`(?mi)\w+\s*=\s*`)
+	props_INI_Matcher_0 = substring.Regexp(`(?mi)\w+\s*=\s*`)
-	r_Rebol_Matcher_0 = regexp.MustCompile(`(?mi)\bRebol\b`)
+	r_Rebol_Matcher_0 = substring.Regexp(`(?mi)\bRebol\b`)
-	r_R_Matcher_0 = regexp.MustCompile(`(?m)<-|^\s*#`)
+	r_R_Matcher_0 = substring.Regexp(`(?m)<-|^\s*#`)
-	rno_RUNOFF_Matcher_0 = regexp.MustCompile(`(?mi)^\.!|^\.end lit(?:eral)?\b`)
+	rno_RUNOFF_Matcher_0 = substring.Regexp(`(?mi)^\.!|^\.end lit(?:eral)?\b`)
-	rno_Roff_Matcher_0 = regexp.MustCompile(`(?m)^\.\\" `)
+	rno_Roff_Matcher_0 = substring.Regexp(`(?m)^\.\\" `)
-	rpy_Python_Matcher_0 = regexp.MustCompile(`(?ms)(^(import|from|class|def)\s)`)
+	rpy_Python_Matcher_0 = substring.Regexp(`(?ms)(^(import|from|class|def)\s)`)
-	rs_Rust_Matcher_0 = regexp.MustCompile(`(?m)^(use |fn |mod |pub |macro_rules|impl|#!?\[)`)
+	rs_Rust_Matcher_0 = substring.Regexp(`(?m)^(use |fn |mod |pub |macro_rules|impl|#!?\[)`)
-	rs_RenderScript_Matcher_0 = regexp.MustCompile(`(?m)#include|#pragma\s+(rs|version)|__attribute__`)
+	rs_RenderScript_Matcher_0 = substring.Regexp(`(?m)#include|#pragma\s+(rs|version)|__attribute__`)
-	sc_SuperCollider_Matcher_0 = regexp.MustCompile(`(?m)\^(this|super)\.`)
+	sc_SuperCollider_Matcher_0 = substring.Regexp(`(?m)\^(this|super)\.`)
-	sc_SuperCollider_Matcher_1 = regexp.MustCompile(`(?m)^\s*(\+|\*)\s*\w+\s*{`)
+	sc_SuperCollider_Matcher_1 = substring.Regexp(`(?m)^\s*(\+|\*)\s*\w+\s*{`)
-	sc_SuperCollider_Matcher_2 = regexp.MustCompile(`(?m)^\s*~\w+\s*=\.`)
+	sc_SuperCollider_Matcher_2 = substring.Regexp(`(?m)^\s*~\w+\s*=\.`)
-	sc_Scala_Matcher_0 = regexp.MustCompile(`(?m)^\s*import (scala|java)\.`)
+	sc_Scala_Matcher_0 = substring.Regexp(`(?m)^\s*import (scala|java)\.`)
-	sc_Scala_Matcher_1 = regexp.MustCompile(`(?m)^\s*val\s+\w+\s*=`)
+	sc_Scala_Matcher_1 = substring.Regexp(`(?m)^\s*val\s+\w+\s*=`)
-	sc_Scala_Matcher_2 = regexp.MustCompile(`(?m)^\s*class\b`)
+	sc_Scala_Matcher_2 = substring.Regexp(`(?m)^\s*class\b`)
-	sql_PLpgSQL_Matcher_0 = regexp.MustCompile(`(?mi)^\\i\b|AS \$\$|LANGUAGE '?plpgsql'?`)
+	sql_PLpgSQL_Matcher_0 = substring.Regexp(`(?mi)^\\i\b|AS \$\$|LANGUAGE '?plpgsql'?`)
-	sql_PLpgSQL_Matcher_1 = regexp.MustCompile(`(?mi)SECURITY (DEFINER|INVOKER)`)
+	sql_PLpgSQL_Matcher_1 = substring.Regexp(`(?mi)SECURITY (DEFINER|INVOKER)`)
-	sql_PLpgSQL_Matcher_2 = regexp.MustCompile(`(?mi)BEGIN( WORK| TRANSACTION)?;`)
+	sql_PLpgSQL_Matcher_2 = substring.Regexp(`(?mi)BEGIN( WORK| TRANSACTION)?;`)
-	sql_SQLPL_Matcher_0 = regexp.MustCompile(`(?mi)(alter module)|(language sql)|(begin( NOT)+ atomic)`)
+	sql_SQLPL_Matcher_0 = substring.Regexp(`(?mi)(alter module)|(language sql)|(begin( NOT)+ atomic)`)
-	sql_SQLPL_Matcher_1 = regexp.MustCompile(`(?mi)signal SQLSTATE '[0-9]+'`)
+	sql_SQLPL_Matcher_1 = substring.Regexp(`(?mi)signal SQLSTATE '[0-9]+'`)
-	sql_PLSQL_Matcher_0 = regexp.MustCompile(`(?mi)\$\$PLSQL_|XMLTYPE|sysdate|systimestamp|\.nextval|connect by|AUTHID (DEFINER|CURRENT_USER)`)
+	sql_PLSQL_Matcher_0 = substring.Regexp(`(?mi)\$\$PLSQL_|XMLTYPE|sysdate|systimestamp|\.nextval|connect by|AUTHID (DEFINER|CURRENT_USER)`)
-	sql_PLSQL_Matcher_1 = regexp.MustCompile(`(?mi)constructor\W+function`)
+	sql_PLSQL_Matcher_1 = substring.Regexp(`(?mi)constructor\W+function`)
-	sql_SQL_Matcher_0 = regexp.MustCompile(`(?mi)! /begin|boolean|package|exception`)
+	sql_SQL_Matcher_0 = substring.Regexp(`(?mi)! /begin|boolean|package|exception`)
-	srt_SubRipText_Matcher_0 = regexp.MustCompile(`(?m)^(\d{2}:\d{2}:\d{2},\d{3})\s*(-->)\s*(\d{2}:\d{2}:\d{2},\d{3})$`)
+	srt_SubRipText_Matcher_0 = substring.Regexp(`(?m)^(\d{2}:\d{2}:\d{2},\d{3})\s*(-->)\s*(\d{2}:\d{2}:\d{2},\d{3})$`)
-	t_Turing_Matcher_0 = regexp.MustCompile(`(?m)^\s*%[ \t]+|^\s*var\s+\w+\s*:=\s*\w+`)
+	t_Turing_Matcher_0 = substring.Regexp(`(?m)^\s*%[ \t]+|^\s*var\s+\w+\s*:=\s*\w+`)
-	t_Perl6_Matcher_0 = regexp.MustCompile(`(?m)^\s*(?:use\s+v6\s*;|\bmodule\b|\b(?:my\s+)?class\b)`)
+	t_Perl6_Matcher_0 = substring.Regexp(`(?m)^\s*(?:use\s+v6\s*;|\bmodule\b|\b(?:my\s+)?class\b)`)
-	t_Perl_Matcher_0 = regexp.MustCompile(`(?m)\buse\s+(?:strict\b|v?5\.)`)
+	t_Perl_Matcher_0 = substring.Regexp(`(?m)\buse\s+(?:strict\b|v?5\.)`)
-	toc_WorldofWarcraftAddonData_Matcher_0 = regexp.MustCompile(`(?m)^## |@no-lib-strip@`)
+	toc_WorldofWarcraftAddonData_Matcher_0 = substring.Regexp(`(?m)^## |@no-lib-strip@`)
-	toc_TeX_Matcher_0 = regexp.MustCompile(`(?m)^\\(contentsline|defcounter|beamer|boolfalse)`)
+	toc_TeX_Matcher_0 = substring.Regexp(`(?m)^\\(contentsline|defcounter|beamer|boolfalse)`)
-	ts_XML_Matcher_0 = regexp.MustCompile(`(?m)<TS`)
+	ts_XML_Matcher_0 = substring.Regexp(`(?m)<TS`)
-	tst_GAP_Matcher_0 = regexp.MustCompile(`(?m)gap> `)
+	tst_GAP_Matcher_0 = substring.Regexp(`(?m)gap> `)
-	tsx_TypeScript_Matcher_0 = regexp.MustCompile(`(?m)^\s*(import.+(from\s+|require\()['"]react|\/\/\/\s*<reference\s)`)
+	tsx_TypeScript_Matcher_0 = substring.Regexp(`(?m)^\s*(import.+(from\s+|require\()['"]react|\/\/\/\s*<reference\s)`)
-	tsx_XML_Matcher_0 = regexp.MustCompile(`(?mi)^\s*<\?xml\s+version`)
+	tsx_XML_Matcher_0 = substring.Regexp(`(?mi)^\s*<\?xml\s+version`)
 )
@@ -2,7 +2,7 @@ package data
 
 // CODE GENERATED AUTOMATICALLY WITH gopkg.in/src-d/enry.v1/internal/code-generator
 // THIS FILE SHOULD NOT BE EDITED BY HAND
-// Extracted from github/linguist commit: b6460f8ed6b249281ada099ca28bd8f1230b8892
+// Extracted from github/linguist commit: d5c8db3fb91963c4b2762ca2ea2ff7cfac109f68
 
 import "gopkg.in/toqueteos/substring.v1"
 
@@ -2,7 +2,7 @@ package data
 
 // CODE GENERATED AUTOMATICALLY WITH gopkg.in/src-d/enry.v1/internal/code-generator
 // THIS FILE SHOULD NOT BE EDITED BY HAND
-// Extracted from github/linguist commit: b6460f8ed6b249281ada099ca28bd8f1230b8892
+// Extracted from github/linguist commit: d5c8db3fb91963c4b2762ca2ea2ff7cfac109f68
 
 var LanguagesByExtension = map[string][]string{
 	".1": {"Roff"},
@@ -227,6 +227,7 @@ var LanguagesByExtension = map[string][]string{
 	".dylan": {"Dylan"},
 	".e": {"E", "Eiffel"},
 	".eam.fs": {"Formatted"},
+	".eb": {"Easybuild"},
 	".ebnf": {"EBNF"},
 	".ebuild": {"Gentoo Ebuild"},
 	".ec": {"eC"},
@@ -723,6 +724,7 @@ var LanguagesByExtension = map[string][]string{
 	".rexx": {"REXX"},
 	".rg": {"Rouge"},
 	".rhtml": {"RHTML"},
+	".ring": {"Ring"},
 	".rkt": {"Racket"},
 	".rktd": {"Racket"},
 	".rktl": {"Racket"},
@@ -1104,6 +1106,7 @@ var ExtensionsByLanguage = map[string][]string{
 	"EJS": {".ejs"},
 	"EQ": {".eq"},
 	"Eagle": {".sch", ".brd"},
+	"Easybuild": {".eb"},
 	"Ecere Projects": {".epj"},
 	"Eiffel": {".e"},
 	"Elixir": {".ex", ".exs"},
@@ -1341,6 +1344,7 @@ var ExtensionsByLanguage = map[string][]string{
 	"Regular Expression": {".regexp", ".regex"},
 	"Ren'Py": {".rpy"},
 	"RenderScript": {".rs", ".rsh"},
+	"Ring": {".ring"},
 	"RobotFramework": {".robot"},
 	"Roff": {".man", ".1", ".1in", ".1m", ".1x", ".2", ".3", ".3in", ".3m", ".3qt", ".3x", ".4", ".5", ".6", ".7", ".8", ".9", ".l", ".me", ".ms", ".n", ".nr", ".rno", ".roff", ".tmac"},
 	"Rouge": {".rg"},
@@ -2,7 +2,7 @@ package data
 
 // CODE GENERATED AUTOMATICALLY WITH gopkg.in/src-d/enry.v1/internal/code-generator
 // THIS FILE SHOULD NOT BE EDITED BY HAND
-// Extracted from github/linguist commit: b6460f8ed6b249281ada099ca28bd8f1230b8892
+// Extracted from github/linguist commit: d5c8db3fb91963c4b2762ca2ea2ff7cfac109f68
 
 var LanguagesByFilename = map[string][]string{
 	".Rprofile": {"R"},
(File diff suppressed because it is too large)
@@ -2,7 +2,7 @@ package data
 
 // CODE GENERATED AUTOMATICALLY WITH gopkg.in/src-d/enry.v1/internal/code-generator
 // THIS FILE SHOULD NOT BE EDITED BY HAND
-// Extracted from github/linguist commit: b6460f8ed6b249281ada099ca28bd8f1230b8892
+// Extracted from github/linguist commit: d5c8db3fb91963c4b2762ca2ea2ff7cfac109f68
 
 var LanguagesByInterpreter = map[string][]string{
 	"Rscript": {"R"},
@@ -2,7 +2,7 @@ package data
 
 // CODE GENERATED AUTOMATICALLY WITH gopkg.in/src-d/enry.v1/internal/code-generator
 // THIS FILE SHOULD NOT BE EDITED BY HAND
-// Extracted from github/linguist commit: b6460f8ed6b249281ada099ca28bd8f1230b8892
+// Extracted from github/linguist commit: d5c8db3fb91963c4b2762ca2ea2ff7cfac109f68
 
 var LanguagesType = map[string]int{
 	"1C Enterprise": 2,
@@ -107,6 +107,7 @@ var LanguagesType = map[string]int{
 	"EJS": 3,
 	"EQ": 2,
 	"Eagle": 3,
+	"Easybuild": 1,
 	"Ecere Projects": 1,
 	"Eiffel": 2,
 	"Elixir": 2,
@@ -349,6 +350,7 @@ var LanguagesType = map[string]int{
 	"Regular Expression": 1,
 	"Ren'Py": 2,
 	"RenderScript": 2,
+	"Ring": 2,
 	"RobotFramework": 2,
 	"Roff": 3,
 	"Rouge": 2,
@@ -2,7 +2,7 @@ package data
 
 // CODE GENERATED AUTOMATICALLY WITH gopkg.in/src-d/enry.v1/internal/code-generator
 // THIS FILE SHOULD NOT BE EDITED BY HAND
-// Extracted from github/linguist commit: b6460f8ed6b249281ada099ca28bd8f1230b8892
+// Extracted from github/linguist commit: d5c8db3fb91963c4b2762ca2ea2ff7cfac109f68
 
 import "gopkg.in/toqueteos/substring.v1"
 
@@ -35,6 +35,7 @@ var VendorMatchers = substring.Or(
 	substring.Regexp(`(^|/)font-awesome\.(css|less|scss|styl)$`),
 	substring.Regexp(`(^|/)foundation\.(css|less|scss|styl)$`),
 	substring.Regexp(`(^|/)normalize\.(css|less|scss|styl)$`),
+	substring.Regexp(`(^|/)skeleton\.(css|less|scss|styl)$`),
 	substring.Regexp(`(^|/)[Bb]ourbon/.*\.(css|less|scss|styl)$`),
 	substring.Regexp(`(^|/)animate\.(css|less|scss|styl)$`),
 	substring.Regexp(`third[-_]?party/`),
type.go (new file, 461 lines)
@@ -0,0 +1,461 @@
+package enry
+
+// CODE GENERATED AUTOMATICALLY WITH gopkg.in/src-d/enry.v1/internal/code-generator
+// THIS FILE SHOULD NOT BE EDITED BY HAND
+// Extracted from github/linguist commit: d5c8db3fb91963c4b2762ca2ea2ff7cfac109f68
+
+var languagesType = map[string]Type{
+	"1C Enterprise": Programming,
+	"ABAP": Programming,
+	"ABNF": Data,
+	"AGS Script": Programming,
+	"AMPL": Programming,
+	"ANTLR": Programming,
+	"API Blueprint": Markup,
+	"APL": Programming,
+	"ASN.1": Data,
+	"ASP": Programming,
+	"ATS": Programming,
+	"ActionScript": Programming,
+	"Ada": Programming,
+	"Agda": Programming,
+	"Alloy": Programming,
+	"Alpine Abuild": Programming,
+	"Ant Build System": Data,
+	"ApacheConf": Markup,
+	"Apex": Programming,
+	"Apollo Guidance Computer": Programming,
+	"AppleScript": Programming,
+	"Arc": Programming,
+	"Arduino": Programming,
+	"AsciiDoc": Prose,
+	"AspectJ": Programming,
+	"Assembly": Programming,
+	"Augeas": Programming,
+	"AutoHotkey": Programming,
+	"AutoIt": Programming,
+	"Awk": Programming,
+	"Batchfile": Programming,
+	"Befunge": Programming,
+	"Bison": Programming,
+	"BitBake": Programming,
+	"Blade": Markup,
+	"BlitzBasic": Programming,
+	"BlitzMax": Programming,
+	"Bluespec": Programming,
+	"Boo": Programming,
+	"Brainfuck": Programming,
+	"Brightscript": Programming,
+	"Bro": Programming,
+	"C": Programming,
+	"C#": Programming,
+	"C++": Programming,
+	"C-ObjDump": Data,
+	"C2hs Haskell": Programming,
+	"CLIPS": Programming,
+	"CMake": Programming,
+	"COBOL": Programming,
+	"COLLADA": Data,
+	"CSON": Data,
+	"CSS": Markup,
+	"CSV": Data,
+	"CWeb": Programming,
+	"Cap'n Proto": Programming,
+	"CartoCSS": Programming,
+	"Ceylon": Programming,
+	"Chapel": Programming,
+	"Charity": Programming,
+	"ChucK": Programming,
+	"Cirru": Programming,
+	"Clarion": Programming,
+	"Clean": Programming,
+	"Click": Programming,
+	"Clojure": Programming,
+	"Closure Templates": Markup,
+	"CoffeeScript": Programming,
+	"ColdFusion": Programming,
+	"ColdFusion CFC": Programming,
+	"Common Lisp": Programming,
+	"Component Pascal": Programming,
+	"Cool": Programming,
+	"Coq": Programming,
+	"Cpp-ObjDump": Data,
+	"Creole": Prose,
+	"Crystal": Programming,
+	"Csound": Programming,
+	"Csound Document": Programming,
+	"Csound Score": Programming,
+	"Cuda": Programming,
+	"Cycript": Programming,
+	"Cython": Programming,
+	"D": Programming,
+	"D-ObjDump": Data,
+	"DIGITAL Command Language": Programming,
+	"DM": Programming,
+	"DNS Zone": Data,
+	"DTrace": Programming,
+	"Darcs Patch": Data,
+	"Dart": Programming,
+	"Diff": Data,
+	"Dockerfile": Data,
+	"Dogescript": Programming,
+	"Dylan": Programming,
+	"E": Programming,
+	"EBNF": Data,
+	"ECL": Programming,
+	"ECLiPSe": Programming,
+	"EJS": Markup,
+	"EQ": Programming,
+	"Eagle": Markup,
+	"Easybuild": Data,
+	"Ecere Projects": Data,
+	"Eiffel": Programming,
+	"Elixir": Programming,
+	"Elm": Programming,
+	"Emacs Lisp": Programming,
+	"EmberScript": Programming,
+	"Erlang": Programming,
+	"F#": Programming,
+	"FLUX": Programming,
+	"Factor": Programming,
+	"Fancy": Programming,
+	"Fantom": Programming,
+	"Filebench WML": Programming,
+	"Filterscript": Programming,
+	"Formatted": Data,
+	"Forth": Programming,
+	"Fortran": Programming,
+	"FreeMarker": Programming,
+	"Frege": Programming,
+	"G-code": Data,
+	"GAMS": Programming,
+	"GAP": Programming,
+	"GCC Machine Description": Programming,
+	"GDB": Programming,
+	"GDScript": Programming,
+	"GLSL": Programming,
+	"GN": Data,
+	"Game Maker Language": Programming,
+	"Genie": Programming,
+	"Genshi": Programming,
+	"Gentoo Ebuild": Programming,
+	"Gentoo Eclass": Programming,
+	"Gettext Catalog": Prose,
+	"Gherkin": Programming,
+	"Glyph": Programming,
+	"Gnuplot": Programming,
+	"Go": Programming,
+	"Golo": Programming,
+	"Gosu": Programming,
+	"Grace": Programming,
+	"Gradle": Data,
+	"Grammatical Framework": Programming,
+	"Graph Modeling Language": Data,
+	"GraphQL": Data,
+	"Graphviz (DOT)": Data,
+	"Groovy": Programming,
+	"Groovy Server Pages": Programming,
+	"HCL": Programming,
+	"HLSL": Programming,
+	"HTML": Markup,
+	"HTML+Django": Markup,
+	"HTML+ECR": Markup,
+	"HTML+EEX": Markup,
+	"HTML+ERB": Markup,
+	"HTML+PHP": Markup,
+	"HTTP": Data,
+	"Hack": Programming,
+	"Haml": Markup,
+	"Handlebars": Markup,
+	"Harbour": Programming,
+	"Haskell": Programming,
+	"Haxe": Programming,
+	"Hy": Programming,
+	"HyPhy": Programming,
+	"IDL": Programming,
+	"IGOR Pro": Programming,
+	"INI": Data,
+	"IRC log": Data,
+	"Idris": Programming,
+	"Inform 7": Programming,
+	"Inno Setup": Programming,
+	"Io": Programming,
+	"Ioke": Programming,
+	"Isabelle": Programming,
+	"Isabelle ROOT": Programming,
+	"J": Programming,
+	"JFlex": Programming,
+	"JSON": Data,
+	"JSON5": Data,
+	"JSONLD": Data,
+	"JSONiq": Programming,
+	"JSX": Programming,
+	"Jasmin": Programming,
+	"Java": Programming,
+	"Java Server Pages": Programming,
+	"JavaScript": Programming,
+	"Jison": Programming,
+	"Jison Lex": Programming,
+	"Jolie": Programming,
+	"Julia": Programming,
+	"Jupyter Notebook": Markup,
+	"KRL": Programming,
+	"KiCad": Programming,
+	"Kit": Markup,
+	"Kotlin": Programming,
+	"LFE": Programming,
+	"LLVM": Programming,
+	"LOLCODE": Programming,
+	"LSL": Programming,
+	"LabVIEW": Programming,
+	"Lasso": Programming,
+	"Latte": Markup,
+	"Lean": Programming,
+	"Less": Markup,
+	"Lex": Programming,
+	"LilyPond": Programming,
+	"Limbo": Programming,
+	"Linker Script": Data,
+	"Linux Kernel Module": Data,
+	"Liquid": Markup,
+	"Literate Agda": Programming,
+	"Literate CoffeeScript": Programming,
+	"Literate Haskell": Programming,
+	"LiveScript": Programming,
+	"Logos": Programming,
+	"Logtalk": Programming,
+	"LookML": Programming,
+	"LoomScript": Programming,
+	"Lua": Programming,
+	"M": Programming,
+	"M4": Programming,
+	"M4Sugar": Programming,
+	"MAXScript": Programming,
+	"MQL4": Programming,
+	"MQL5": Programming,
+	"MTML": Markup,
+	"MUF": Programming,
+	"Makefile": Programming,
+	"Mako": Programming,
+	"Markdown": Prose,
+	"Marko": Markup,
+	"Mask": Markup,
+	"Mathematica": Programming,
+	"Matlab": Programming,
+	"Maven POM": Data,
+	"Max": Programming,
+	"MediaWiki": Prose,
+	"Mercury": Programming,
+	"Meson": Programming,
+	"Metal": Programming,
+	"MiniD": Programming,
+	"Mirah": Programming,
+	"Modelica": Programming,
+	"Modula-2": Programming,
+	"Module Management System": Programming,
+	"Monkey": Programming,
+	"Moocode": Programming,
+	"MoonScript": Programming,
+	"Myghty": Programming,
+	"NCL": Programming,
+	"NL": Data,
+	"NSIS": Programming,
+	"Nemerle": Programming,
+	"NetLinx": Programming,
+	"NetLinx+ERB": Programming,
+	"NetLogo": Programming,
+	"NewLisp": Programming,
+	"Nginx": Markup,
+	"Nim": Programming,
+	"Ninja": Data,
+	"Nit": Programming,
+	"Nix": Programming,
+	"Nu": Programming,
+	"NumPy": Programming,
+	"OCaml": Programming,
+	"ObjDump": Data,
+	"Objective-C": Programming,
+	"Objective-C++": Programming,
+	"Objective-J": Programming,
+	"Omgrofl": Programming,
+	"Opa": Programming,
+	"Opal": Programming,
+	"OpenCL": Programming,
+	"OpenEdge ABL": Programming,
+	"OpenRC runscript": Programming,
+	"OpenSCAD": Programming,
+	"OpenType Feature File": Data,
+	"Org": Prose,
+	"Ox": Programming,
+	"Oxygene": Programming,
+	"Oz": Programming,
+	"P4": Programming,
+	"PAWN": Programming,
+	"PHP": Programming,
+	"PLSQL": Programming,
+	"PLpgSQL": Programming,
+	"POV-Ray SDL": Programming,
+	"Pan": Programming,
+	"Papyrus": Programming,
+	"Parrot": Programming,
+	"Parrot Assembly": Programming,
+	"Parrot Internal Representation": Programming,
+	"Pascal": Programming,
+	"Pep8": Programming,
+	"Perl": Programming,
+	"Perl 6": Programming,
+	"Pic": Markup,
+	"Pickle": Data,
+	"PicoLisp": Programming,
+	"PigLatin": Programming,
+	"Pike": Programming,
+	"Pod": Prose,
+	"PogoScript": Programming,
+	"Pony": Programming,
+	"PostScript": Markup,
+	"PowerBuilder": Programming,
+	"PowerShell": Programming,
+	"Processing": Programming,
+	"Prolog": Programming,
+	"Propeller Spin": Programming,
+	"Protocol Buffer": Markup,
+	"Public Key": Data,
+	"Pug": Markup,
+	"Puppet": Programming,
+	"Pure Data": Programming,
+	"PureBasic": Programming,
+	"PureScript": Programming,
+	"Python": Programming,
+	"Python console": Programming,
+	"Python traceback": Data,
+	"QML": Programming,
+	"QMake": Programming,
+	"R": Programming,
+	"RAML": Markup,
+	"RDoc": Prose,
+	"REALbasic": Programming,
+	"REXX": Programming,
+	"RHTML": Markup,
+	"RMarkdown": Prose,
+	"RPM Spec": Data,
+	"RUNOFF": Markup,
+	"Racket": Programming,
+	"Ragel": Programming,
+	"Rascal": Programming,
+	"Raw token data": Data,
+	"Reason": Programming,
+	"Rebol": Programming,
+	"Red": Programming,
+	"Redcode": Programming,
+	"Regular Expression": Data,
+	"Ren'Py": Programming,
+	"RenderScript": Programming,
+	"Ring": Programming,
+	"RobotFramework": Programming,
+	"Roff": Markup,
+	"Rouge": Programming,
+	"Ruby": Programming,
+	"Rust": Programming,
+	"SAS": Programming,
+	"SCSS": Markup,
+	"SMT": Programming,
+	"SPARQL": Data,
+	"SQF": Programming,
+	"SQL": Data,
+	"SQLPL": Programming,
+	"SRecode Template": Markup,
+	"STON": Data,
+	"SVG": Data,
+	"Sage": Programming,
+	"SaltStack": Programming,
+	"Sass": Markup,
+	"Scala": Programming,
+	"Scaml": Markup,
+	"Scheme": Programming,
+	"Scilab": Programming,
+	"Self": Programming,
+	"ShaderLab": Programming,
+	"Shell": Programming,
+	"ShellSession": Programming,
+	"Shen": Programming,
+	"Slash": Programming,
+	"Slim": Markup,
+	"Smali": Programming,
+	"Smalltalk": Programming,
+	"Smarty": Programming,
+	"SourcePawn": Programming,
+	"Spline Font Database": Data,
+	"Squirrel": Programming,
+	"Stan": Programming,
+	"Standard ML": Programming,
+	"Stata": Programming,
+	"Stylus": Markup,
+	"SubRip Text": Data,
+	"Sublime Text Config": Data,
+	"SuperCollider": Programming,
+	"Swift": Programming,
+	"SystemVerilog": Programming,
+	"TI Program": Programming,
+	"TLA": Programming,
+	"TOML": Data,
+	"TXL": Programming,
+	"Tcl": Programming,
+	"Tcsh": Programming,
+	"TeX": Markup,
+	"Tea": Markup,
+	"Terra": Programming,
+	"Text": Prose,
+	"Textile": Prose,
+	"Thrift": Programming,
+	"Turing": Programming,
+	"Turtle": Data,
+	"Twig": Markup,
+	"Type Language": Data,
+	"TypeScript": Programming,
+	"Unified Parallel C": Programming,
+	"Unity3D Asset": Data,
+	"Unix Assembly": Programming,
+	"Uno": Programming,
+	"UnrealScript": Programming,
+	"UrWeb": Programming,
+	"VCL": Programming,
+	"VHDL": Programming,
+	"Vala": Programming,
+	"Verilog": Programming,
+	"Vim script": Programming,
+	"Visual Basic": Programming,
+	"Volt": Programming,
+	"Vue": Markup,
+	"Wavefront Material": Data,
+	"Wavefront Object": Data,
+	"Web Ontology Language": Markup,
+	"WebAssembly": Programming,
+	"WebIDL": Programming,
+	"World of Warcraft Addon Data": Data,
+	"X10": Programming,
+	"XC": Programming,
+	"XCompose": Data,
+	"XML": Data,
+	"XPages": Programming,
+	"XProc": Programming,
+	"XQuery": Programming,
+	"XS": Programming,
+	"XSLT": Programming,
+	"Xojo": Programming,
+	"Xtend": Programming,
+	"YAML": Data,
+	"YANG": Data,
+	"Yacc": Programming,
+	"Zephir": Programming,
+	"Zimpl": Programming,
+	"desktop": Data,
+	"eC": Programming,
+	"edn": Data,
+	"fish": Programming,
+	"mupad": Programming,
+	"nesC": Programming,
+	"ooc": Programming,
+	"reStructuredText": Prose,
+	"wisp": Programming,
+	"xBase": Programming,
+}
@@ -3,7 +3,6 @@ package enry
 
 import (
 	"bytes"
 	"fmt"
-	"testing"
 
 	"github.com/stretchr/testify/assert"
 )
@@ -80,20 +79,3 @@ func (s *EnryTestSuite) TestIsBinary() {
 		assert.Equal(s.T(), is, test.expected, fmt.Sprintf("%v: is = %v, expected: %v", test.name, is, test.expected))
 	}
 }
-
-const (
-	htmlPath = "some/random/dir/file.html"
-	jsPath   = "some/random/dir/file.js"
-)
-
-func BenchmarkVendor(b *testing.B) {
-	for i := 0; i < b.N; i++ {
-		_ = IsVendor(htmlPath)
-	}
-}
-
-func BenchmarkVendorJS(b *testing.B) {
-	for i := 0; i < b.N; i++ {
-		_ = IsVendor(jsPath)
-	}
-}