Merge pull request #22 from erizocosmico/feature/generated

implement IsGenerated helper to filter out generated files
This commit is contained in:
Máximo Cuadros 2020-05-28 10:34:37 +02:00 committed by GitHub
commit e1f1b57a84
No known key found for this signature in database
GPG Key ID: 4AEE18F83AFDEB23
54 changed files with 4513 additions and 98 deletions

181
README.md
View File

@ -1,26 +1,26 @@
# go-enry [![GoDoc](https://godoc.org/github.com/go-enry/go-enry?status.svg)](https://pkg.go.dev/github.com/go-enry/go-enry/v2) [![Test](https://github.com/go-enry/go-enry/workflows/Test/badge.svg)](https://github.com/go-enry/go-enry/actions?query=workflow%3ATest+branch%3Amaster) [![codecov](https://codecov.io/gh/go-enry/go-enry/branch/master/graph/badge.svg)](https://codecov.io/gh/go-enry/go-enry)
Programming language detector and toolbox to ignore binary or vendored files. _enry_ started as a port to _Go_ of the original [Linguist](https://github.com/github/linguist) _Ruby_ library and offers roughly _2x_ the performance.
- [CLI](#cli)
- [Library](#library)
- [Use cases](#use-cases)
- [By filename](#by-filename)
- [By text](#by-text)
- [By file](#by-file)
- [Filtering](#filtering-vendoring-binaries-etc)
- [Coloring](#language-colors-and-groups)
- [Languages](#languages)
- [Go](#go)
- [Java bindings](#java-bindings)
- [Python bindings](#python-bindings)
- [Divergences from linguist](#divergences-from-linguist)
- [Benchmarks](#benchmarks)
- [Why Enry?](#why-enry)
- [Development](#development)
- [Sync with github/linguist upstream](#sync-with-githublinguist-upstream)
- [Misc](#misc)
- [License](#license)
# CLI
@ -28,51 +28,62 @@ The CLI binary is hosted in a separate repository [go-enry/enry](https://github.
# Library
_enry_ is also a Go library for guessing a programming language; it exposes its API through FFI to multiple programming environments.
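For a quick feel of the Go API, here is a minimal sketch of the top-level entry point (the import path comes from the badges above; check the godoc for the exact signatures):
```go
package main

import (
	"fmt"

	"github.com/go-enry/go-enry/v2"
)

func main() {
	// GetLanguage runs the full chain of strategies on a file name plus its content.
	lang := enry.GetLanguage("foo.go", []byte("package main\n\nfunc main() {}\n"))
	fmt.Println(lang) // expected: Go
}
```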
## Use cases
_enry_ guesses a programming language using a sequence of matching _strategies_ that are applied progressively to narrow down the possible options. Each _strategy_ differs in the type of input data it needs to make a decision: file name, extension, the first line of the file, the full content of the file, etc.
Depending on the available input data, the enry API can be roughly divided into the following categories or use cases.
### By filename
The following functions require only the name of the file to make a guess (a short usage sketch follows the list):
- `GetLanguageByExtension` uses only the file extension (which may be ambiguous)
- `GetLanguageByFilename` is useful for cases like `.gitignore`, `.bashrc`, etc.
- all [filtering helpers](#filtering)
Please note that such guesses are expected not to be very accurate.
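A minimal sketch of the name-only helpers; the `(language, safe)` return shape is assumed from the godoc, where `safe` signals an unambiguous match:
```go
package main

import (
	"fmt"

	"github.com/go-enry/go-enry/v2"
)

func main() {
	// Guess from the extension alone.
	lang, safe := enry.GetLanguageByExtension("foo.go")
	fmt.Println(lang, safe) // Go true (expected)

	// Guess from well-known file names without a useful extension.
	lang, safe = enry.GetLanguageByFilename(".bashrc")
	fmt.Println(lang, safe) // Shell true (expected)
}
```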
### By text
To make a guess based only on the content of the file or a text snippet, use (see the sketch below):
- `GetLanguageByShebang` reads only the first line of text to identify the [shebang](<https://en.wikipedia.org/wiki/Shebang_(Unix)>).
- `GetLanguageByModeline` for cases when a Vim/Emacs modeline, e.g. `/* vim: set ft=cpp: */`, may be present at the head or tail of the text.
- `GetLanguageByClassifier` uses a Bayesian classifier trained on all the `./samples/` from Linguist.
It is usually a last-resort strategy, used to disambiguate the guesses of the previous strategies, and thus it requires a list of "candidate" guesses. If no better hypotheses are available, one can pass all known languages (the keys of `data.LanguagesLogProbabilities`) as candidates, at the price of possibly suboptimal accuracy.
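A sketch of the content-only helpers; the candidate-list argument to `GetLanguageByClassifier` is assumed from the description above, so double-check the godoc:
```go
package main

import (
	"fmt"

	"github.com/go-enry/go-enry/v2"
)

func main() {
	// Only the first line (the shebang) is inspected.
	lang, safe := enry.GetLanguageByShebang([]byte("#!/usr/bin/env python\nprint('hi')\n"))
	fmt.Println(lang, safe) // Python true (expected)

	// A modeline at the head or tail of the text.
	lang, safe = enry.GetLanguageByModeline([]byte("/* vim: set ft=cpp: */\nint main() {}\n"))
	fmt.Println(lang, safe)

	// Last-resort disambiguation between explicit candidates.
	lang, safe = enry.GetLanguageByClassifier([]byte("SELECT 1;"), []string{"SQL", "PLpgSQL"})
	fmt.Println(lang, safe)
}
```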
### By file
The most accurate guess is made when both the file name and the content are available (example after the list):
- `GetLanguagesByContent` only uses file extension and a set of regexp-based content heuristics.
- `GetLanguages` uses the full set of matching strategies and is expected to be most accurate.
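For illustration, a minimal sketch that feeds a real file to the full-chain helpers (the input path is a placeholder; `GetLanguagesByContent` follows the same filename-plus-content pattern):
```go
package main

import (
	"fmt"
	"log"
	"os"

	"github.com/go-enry/go-enry/v2"
)

func main() {
	// Hypothetical input path; replace with any file you want to classify.
	const path = "main.go"
	content, err := os.ReadFile(path)
	if err != nil {
		log.Fatal(err)
	}

	// All candidate languages for this name/content pair.
	fmt.Println(enry.GetLanguages(path, content))

	// Single best guess using the full strategy chain.
	fmt.Println(enry.GetLanguage(path, content))
}
```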
### Filtering: vendoring, binaries, etc
_enry_ exposes a set of file-level helpers `Is*` to simplify filtering out the files that are less interesting for the purpose of source code analysis (a usage sketch follows the list):
- `IsBinary`
- `IsVendor`
- `IsConfiguration`
- `IsDocumentation`
- `IsDotFile`
- `IsImage`
- `IsTest`
- `IsGenerated`
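A sketch of how the filters might be combined; `IsGenerated` is assumed to take both the path and the content, mirroring Linguist's generated-file detection, and the sample path and content are made up for illustration:
```go
package main

import (
	"fmt"

	"github.com/go-enry/go-enry/v2"
)

func main() {
	// Hypothetical file from a repository walk.
	path := "vendor/github.com/foo/bar.pb.go"
	content := []byte("// Code generated by protoc-gen-go. DO NOT EDIT.\npackage bar\n")

	// Path-based filters.
	fmt.Println(enry.IsVendor(path))  // true: lives under vendor/
	fmt.Println(enry.IsDotFile(path)) // false
	fmt.Println(enry.IsDocumentation(path))

	// Content-based filter.
	fmt.Println(enry.IsBinary(content)) // false: plain text

	// New in this PR: path + content heuristics for generated files (assumed signature).
	fmt.Println(enry.IsGenerated(path, content))
}
```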
### Language colors and groups
_enry_ exposes functions to get a language's color, for example to use when presenting statistics in graphs (see the sketch below):
- `GetColor`
- `GetLanguageGroup` can be used to group similar languages together e.g. for `Less` this function will return `CSS`
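For instance (the hex value shown for Go is the one Linguist assigns; treat it as illustrative):
```go
package main

import (
	"fmt"

	"github.com/go-enry/go-enry/v2"
)

func main() {
	// Hex color associated with a language, handy for charts.
	fmt.Println(enry.GetColor("Go")) // e.g. "#00ADD8"

	// Group related languages under a common umbrella language.
	fmt.Println(enry.GetLanguageGroup("Less")) // "CSS"
}
```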
## Languages
@ -137,39 +148,36 @@ Generated Python bindings using a C shared library and cffi are WIP under [src-d
A library is going to be published on PyPI as [enry](https://pypi.org/project/enry/) for macOS and Linux platforms. Windows support is planned under [src-d/enry#150](https://github.com/src-d/enry/issues/150).
## Divergences from Linguist
The `enry` library is based on the data from `github/linguist` version **v7.9.0**.
When parsing [linguist/samples](https://github.com/github/linguist/tree/master/samples), the following `enry` results differ from Linguist's:
* [Heuristics for ".es" extension](https://github.com/github/linguist/blob/e761f9b013e5b61161481fcb898b59721ee40e3d/lib/linguist/heuristics.yml#L103) in JavaScript could not be parsed, due to unsupported backreference in RE2 regexp engine.
- [Heuristics for ".es" extension](https://github.com/github/linguist/blob/e761f9b013e5b61161481fcb898b59721ee40e3d/lib/linguist/heuristics.yml#L103) in JavaScript could not be parsed, due to unsupported backreference in RE2 regexp engine.
* [Heuristics for ".rno" extension](https://github.com/github/linguist/blob/3a1bd3c3d3e741a8aaec4704f782e06f5cd2a00d/lib/linguist/heuristics.yml#L365) in RUNOFF could not be parsed, due to unsupported lookahead in RE2 regexp engine.
- [Heuristics for ".rno" extension](https://github.com/github/linguist/blob/3a1bd3c3d3e741a8aaec4704f782e06f5cd2a00d/lib/linguist/heuristics.yml#L365) in RUNOFF could not be parsed, due to unsupported lookahead in RE2 regexp engine.
* [Heuristics for ".inc" extension](https://github.com/github/linguist/blob/f0e2d0d7f1ce600b2a5acccaef6b149c87d8b99c/lib/linguist/heuristics.yml#L222) in NASL could not be parsed, due to unsupported possessive quantifier in RE2 regexp engine.
- [Heuristics for ".inc" extension](https://github.com/github/linguist/blob/f0e2d0d7f1ce600b2a5acccaef6b149c87d8b99c/lib/linguist/heuristics.yml#L222) in NASL could not be parsed, due to unsupported possessive quantifier in RE2 regexp engine.
- As of [Linguist v5.3.2](https://github.com/github/linguist/releases/tag/v5.3.2) it uses a [flex-based scanner in C for tokenization](https://github.com/github/linguist/pull/3846). Enry still uses the regex-based [extract_token](https://github.com/github/linguist/pull/3846/files#diff-d5179df0b71620e3fac4535cd1368d15L60) algorithm. See [#193](https://github.com/src-d/enry/issues/193).
- The Bayesian classifier can't distinguish "SQL" from "PLpgSQL". See [#194](https://github.com/src-d/enry/issues/194).
- Detection of [generated files](https://github.com/github/linguist/blob/bf95666fc15e49d556f2def4d0a85338423c25f3/lib/linguist/generated.rb#L53) is not supported yet.
(Thus they are not excluded from CLI output). See [#213](https://github.com/src-d/enry/issues/213).
- XML detection strategy is not implemented. See [#192](https://github.com/src-d/enry/issues/192).
- Overriding languages and types through `.gitattributes` is not yet supported. See [#18](https://github.com/src-d/enry/issues/18).
- `enry` CLI output does NOT exclude `.gitignore`ed files and git submodules, as Linguist does
For all the cases above that have an issue number, we plan to update enry to match Linguist's behavior.
## Benchmarks
Enry's language detection has been compared with Linguist's on [_linguist/samples_](https://github.com/github/linguist/tree/master/samples).
We got these results:
@ -183,9 +191,7 @@ Go regexp engine being slower than Ruby's one, which is based on [oniguruma](https
See [instructions](#misc) for running enry with oniguruma.
## Why Enry?
In the movie [My Fair Lady](https://en.wikipedia.org/wiki/My_Fair_Lady), [Professor Henry Higgins](http://www.imdb.com/character/ch0011719/) is a linguist who at the very beginning of the movie enjoys guessing the origin of people based on their accent.
@ -200,10 +206,9 @@ To run the tests use:
Setting `ENRY_TEST_REPO` to the path of an existing checkout of Linguist will avoid cloning it and speed the tests up.
Setting `ENRY_DEBUG=1` will provide insight in the Bayesian classifier building done by `make code-generate`.
### Sync with github/linguist upstream
_enry_ re-uses parts of the original [github/linguist](https://github.com/github/linguist) to generate internal data structures.
In order to update to the latest release of Linguist, run:
```bash
@ -218,10 +223,10 @@ $ make code-generate
To stay in sync, enry needs to be updated whenever a new release of Linguist includes changes to any of the following files:
- [languages.yml](https://github.com/github/linguist/blob/master/lib/linguist/languages.yml)
- [heuristics.yml](https://github.com/github/linguist/blob/master/lib/linguist/heuristics.yml)
- [vendor.yml](https://github.com/github/linguist/blob/master/lib/linguist/vendor.yml)
- [documentation.yml](https://github.com/github/linguist/blob/master/lib/linguist/documentation.yml)
There is no automation for detecting changes in the Linguist project, so the process above has to be done manually from time to time.
@ -230,8 +235,6 @@ the generated files (in [data](https://github.com/go-enry/go-enry/blob/master/da
Separating all the necessary "manual" code changes into a different PR that includes some background description and an update to the documentation on ["divergences from linguist"](#divergences-from-linguist) is very much appreciated, as it simplifies maintenance (review/release notes/etc).
## Misc
<details>
@ -239,19 +242,20 @@ Separating all the necessary "manual" code changes to a different PR that includ
### Benchmark
All benchmark scripts are in [_benchmarks_](https://github.com/go-enry/go-enry/blob/master/benchmarks) directory.
#### Dependencies
As the benchmarks depend on Ruby and the GitHub-Linguist gem, make sure you have:
- Ruby (e.g. using [`rbenv`](https://github.com/rbenv/rbenv)) and [`bundler`](https://bundler.io/) installed
- Docker
- [native dependencies](https://github.com/github/linguist/#dependencies) installed
- Build the gem `cd .linguist && bundle install && rake build_gem && cd -`
- Install it `gem install --no-rdoc --no-ri --local .linguist/github-linguist-*.gem`
#### Quick benchmark
To run quicker benchmarks
make benchmarks
@ -260,19 +264,20 @@ to get average times for the primary detection function and strategies for the w
make benchmarks-samples
#### Full benchmark
If you want to reproduce the same benchmarks as reported above:
- Make sure all [dependencies](#benchmark-dependencies) are installed
- Install [gnuplot](http://gnuplot.info) (in order to plot the histogram)
- Run `ENRY_TEST_REPO="$PWD/.linguist" benchmarks/run.sh` (takes ~15h)
It will run the benchmarks for enry and Linguist, parse the output, create CSV files, and plot the histogram.
### Faster regexp engine (optional)
[Oniguruma](https://github.com/kkos/oniguruma) is CRuby's regular expression engine.
It is very fast and performs better than the one built into the Go runtime. _enry_ supports swapping
between those two engines thanks to [rubex](https://github.com/moovweb/rubex) project.
The typical overall speedup from using Oniguruma is 1.5-2x. However, it requires cgo and an external shared library.
On macOS with [Homebrew](https://brew.sh/), it is:
@ -297,8 +302,6 @@ and then rebuild the project.
</details>
## License
Apache License, Version 2.0. See [LICENSE](LICENSE)

42
_testdata/C/image.c Normal file
View File

@ -0,0 +1,42 @@
/* GIMP RGB C-Source image dump (image.c) */
static const struct {
guint width;
guint height;
guint bytes_per_pixel; /* 2:RGB16, 3:RGB, 4:RGBA */
guint8 pixel_data[16 * 16 * 3 + 1];
} gimp_image = {
16, 16, 3,
"\377\063\000\377T\001\377\207\001\377\272\000\377\354\000\341\377\001\256\377\000{\377\001"
"Z\377\000(\377\000\000\377\014\001\377>\000\377q\001\377\243\000\377\325\001\377\367\377T"
"\001\377\207\001\377\271\000\377\354\000\341\377\001\256\377\001{\377\001Z\377\001(\377\001"
"\000\377\014\001\377>\001\377q\001\377\243\000\377\325\001\377\367\001\325\377\377\207\000"
"\377\272\001\377\354\000\341\377\001\257\377\001|\377\000[\377\001(\377\000\001\377\013\001"
"\377>\000\377q\000\377\244\000\377\325\000\377\367\000\325\377\001\243\377\377\272\001"
"\377\354\001\340\377\001\256\377\000|\377\001Z\377\000(\377\000\001\377\014\001\377>\000\377"
"q\000\377\243\001\377\326\000\377\370\001\325\377\001\243\377\000q\377\377\354\000\341"
"\377\001\257\377\000|\377\000Z\377\001(\377\000\000\377\013\000\377?\001\377q\001\377\243\000"
"\377\325\001\377\370\000\326\377\000\243\377\001q\377\000>\377\340\377\001\257\377\001"
"{\377\001Z\377\000(\377\001\000\377\014\001\377>\001\377q\000\377\243\000\377\325\000\377\367"
"\001\325\377\000\243\377\000p\377\001?\377\000\014\377\256\377\000|\377\000Z\377\001'\377"
"\001\001\377\014\001\377>\000\377p\001\377\243\001\377\326\001\377\367\000\326\377\000\243\377"
"\000q\377\000>\377\000\014\377(\000\377|\377\001J\377\001(\377\001\001\377\013\000\377>\001\377"
"q\000\377\243\000\377\325\000\377\367\001\326\377\000\243\377\000p\377\000>\377\001\014\377"
"'\000\377J\001\377I\377\001'\377\000\001\377\013\001\377>\001\377p\000\377\243\000\377\326\001"
"\367\377\000\326\377\000\243\377\000q\377\001>\377\001\014\377(\001\377I\001\377|\001\377"
"'\377\001\000\377\014\000\377>\001\377p\000\377\243\000\377\326\000\367\377\000\325\377\000"
"\243\377\001p\377\001?\377\001\014\377(\000\377Z\000\377|\000\377\256\000\377\001\377\013\000"
"\377>\001\377p\000\377\243\001\377\325\001\367\377\001\326\377\000\243\377\000p\377\001?"
"\377\001\013\377(\000\377Z\000\377|\000\377\256\000\377\341\001\377\000\377?\001\377p\001\377"
"\243\001\377\326\000\367\377\001\326\377\001\243\377\001q\377\001>\377\000\014\377(\000\377"
"Z\001\377|\000\377\256\001\377\341\000\377\377\000\354\000\377p\000\377\243\001\377\326\001"
"\367\377\001\326\377\000\243\377\001q\377\000>\377\001\013\377(\001\377Z\000\377|\001\377"
"\257\000\377\341\000\377\377\000\354\377\001\271\000\377\243\000\377\326\001\367\377\001"
"\326\377\001\243\377\000q\377\000>\377\000\014\377'\001\377Z\000\377{\001\377\256\000\377"
"\341\001\377\377\001\354\377\001\271\377\000\207\000\377\325\001\367\377\000\326\377\000"
"\243\377\001q\377\001>\377\001\014\377(\000\377Z\001\377|\000\377\256\000\377\340\000\377"
"\377\000\354\377\001\271\377\000\207\377\000T\000\367\377\001\325\377\000\243\377\000p\377"
"\001>\377\001\014\377(\001\377Z\000\377|\001\377\256\000\377\341\001\377\377\000\354\377\000"
"\271\377\000\207\377\000T\377\001\063",
};

31
_testdata/C/image.h Normal file
View File

@ -0,0 +1,31 @@
/* GIMP header image file format (RGB): image.h */
static unsigned int width = 16;
static unsigned int height = 16;
/* Call this macro repeatedly. After each use, the pixel data can be extracted */
#define HEADER_PIXEL(data,pixel) {\
pixel[0] = (((data[0] - 33) << 2) | ((data[1] - 33) >> 4)); \
pixel[1] = ((((data[1] - 33) & 0xF) << 4) | ((data[2] - 33) >> 2)); \
pixel[2] = ((((data[2] - 33) & 0x3) << 6) | ((data[3] - 33))); \
data += 4; \
}
static char *header_data =
"`T-!`V1\"`Y=\"`\\I!`_Q!Y@]\"LP]!?`]\"7P]!+0]!!0]-!@]_!0^R!@_D!0`6!@`X"
"`V1\"`Y=\"`\\E!`_Q!Y@]\"LP]\"?`]\"7P]\"+0]\"!0]-!@]_!@^R!@_D!0`6!@`X!>8`"
"`Y=!`\\I\"`_Q!Y@]\"L`]\"@0]!7`]\"+0]!!@],!@]_!0^R!0_E!0`6!0`X!.8`!;0`"
"`\\I\"`_Q\"Y0]\"LP]!@0]\"7P]!+0]!!@]-!@]_!0^R!0_D!@`7!0`Y!>8`!;0`!((`"
"`_Q!Y@]\"L`]!@0]!7P]\"+0]!!0],!0]`!@^R!@_D!0`6!@`Y!.<`!+0`!8(`!$\\`"
"Y0]\"L`]\"?`]\"7P]!+0]\"!0]-!@]_!@^R!0_D!0`6!0`X!>8`!+0`!($`!4``!!T`"
"LP]!@0]!7P]\"*`]\"!@]-!@]_!0^Q!@_D!@`7!@`X!.<`!+0`!((`!$\\`!!T`+!$`"
"@0]\"3P]\"+0]\"!@],!0]_!@^R!0_D!0`6!0`X!><`!+0`!($`!$\\`!1T`*Q$`3A(`"
"3@]\"*`]!!@],!@]_!@^Q!0_D!0`7!@@`!.<`!+0`!((`!4\\`!1T`+!(`31(`@!(`"
"*`]\"!0]-!0]_!@^Q!0_D!0`7!0@`!.8`!+0`!8$`!4``!1T`+!$`7A$`@!$`LA$`"
"!@],!0]_!@^Q!0_D!@`6!@@`!><`!+0`!($`!4``!1P`+!$`7A$`@!$`LA$`Y1(`"
"!0]`!@^Q!@_D!@`7!0@`!><`!;0`!8(`!4\\`!!T`+!$`7A(`@!$`LA(`Y1$``Q$M"
"!0^Q!0_D!@`7!@@`!><`!+0`!8(`!$\\`!1P`+!(`7A$`@!(`LQ$`Y1$``Q$M`Q'Z"
"!0_D!0`7!@@`!><`!;0`!((`!$\\`!!T`*Q(`7A$`?Q(`LA$`Y1(``Q(M`Q'Z`Q#("
"!0`6!@@`!.<`!+0`!8(`!4\\`!1T`+!$`7A(`@!$`LA$`Y!$``Q$M`Q'Z`Q#(`Q\"5"
"!0@`!>8`!+0`!($`!4\\`!1T`+!(`7A$`@!(`LA$`Y1(``Q$M`Q#Z`Q#(`Q\"5`Q%T"
"";

File diff suppressed because one or more lines are too long

View File

@ -0,0 +1,12 @@
/** Begin line maps. **/{ "file":"out.js", "count": 2 }
[0,0,0,0,0,0,1,1,1,1,2]
[2,2,2,2,2,2,3,4,4,4,4,4]
/** Begin file information. **/
["a.js", "b.js"]
["b.js", "c.js", "d.js"]
/** Begin mapping definitions. **/
["a.js", 1, 34]
["a.js", 5, 2]
["b.js", 1, 3, "event"]
["c.js", 1, 4]
["d.js", 3, 78, "foo"]

View File

@ -0,0 +1 @@
{"version":3,"file":"out.js","sourceRoot":"","sources":["foo.js","bar.js"],"sourcesContent":[null,null],"names":["src","maps","are","fun"],"mappings":"A,AAAB;;ABCDE;"}

View File

@ -0,0 +1,86 @@
// Generated by Haxe 4.0.5
#include <hxcpp.h>
#ifndef INCLUDED_Main
#include <Main.h>
#endif
#ifndef INCLUDED_haxe_Log
#include <haxe/Log.h>
#endif
HX_LOCAL_STACK_FRAME(_hx_pos_e47a9afac0942eb9_3_main,"Main","main",0xed0e206e,"Main.main","Main.hx",3,0x087e5c05)
void Main_obj::__construct() { }
Dynamic Main_obj::__CreateEmpty() { return new Main_obj; }
void *Main_obj::_hx_vtable = 0;
Dynamic Main_obj::__Create(hx::DynamicArray inArgs)
{
hx::ObjectPtr< Main_obj > _hx_result = new Main_obj();
_hx_result->__construct();
return _hx_result;
}
bool Main_obj::_hx_isInstanceOf(int inClassId) {
return inClassId==(int)0x00000001 || inClassId==(int)0x332f6459;
}
void Main_obj::main(){
HX_STACKFRAME(&_hx_pos_e47a9afac0942eb9_3_main)
HXDLIN( 3) ::haxe::Log_obj::trace(HX_("Hello World",84,f6,db,6f),hx::SourceInfo(HX_("source/Main.hx",91,d3,a7,40),3,HX_("Main",59,64,2f,33),HX_("main",39,38,56,48)));
}
STATIC_HX_DEFINE_DYNAMIC_FUNC0(Main_obj,main,(void))
Main_obj::Main_obj()
{
}
bool Main_obj::__GetStatic(const ::String &inName, Dynamic &outValue, hx::PropertyAccess inCallProp)
{
switch(inName.length) {
case 4:
if (HX_FIELD_EQ(inName,"main") ) { outValue = main_dyn(); return true; }
}
return false;
}
#ifdef HXCPP_SCRIPTABLE
static hx::StorageInfo *Main_obj_sMemberStorageInfo = 0;
static hx::StaticInfo *Main_obj_sStaticStorageInfo = 0;
#endif
hx::Class Main_obj::__mClass;
static ::String Main_obj_sStaticFields[] = {
HX_("main",39,38,56,48),
::String(null())
};
void Main_obj::__register()
{
Main_obj _hx_dummy;
Main_obj::_hx_vtable = *(void **)&_hx_dummy;
hx::Static(__mClass) = new hx::Class_obj();
__mClass->mName = HX_("Main",59,64,2f,33);
__mClass->mSuper = &super::__SGetClass();
__mClass->mConstructEmpty = &__CreateEmpty;
__mClass->mConstructArgs = &__Create;
__mClass->mGetStaticField = &Main_obj::__GetStatic;
__mClass->mSetStaticField = &hx::Class_obj::SetNoStaticField;
__mClass->mStatics = hx::Class_obj::dupFunctions(Main_obj_sStaticFields);
__mClass->mMembers = hx::Class_obj::dupFunctions(0 /* sMemberFields */);
__mClass->mCanCast = hx::TCanCast< Main_obj >;
#ifdef HXCPP_SCRIPTABLE
__mClass->mMemberStorageInfo = Main_obj_sMemberStorageInfo;
#endif
#ifdef HXCPP_SCRIPTABLE
__mClass->mStaticStorageInfo = Main_obj_sStaticStorageInfo;
#endif
hx::_hx_RegisterClass(__mClass->mName, __mClass);
}

View File

@ -0,0 +1,37 @@
// Generated by Haxe 4.0.5
#pragma warning disable 109, 114, 219, 429, 168, 162
public class EntryPoint__Main {
public static void Main() {
global::cs.Boot.init();
{
global::Main.main();
}
}
}
public class Main : global::haxe.lang.HxObject {
public Main(global::haxe.lang.EmptyObject empty) {
}
public Main() {
global::Main.__hx_ctor__Main(this);
}
protected static void __hx_ctor__Main(global::Main __hx_this) {
}
public static void main() {
unchecked {
global::haxe.Log.trace.__hx_invoke2_o(default(double), "Hello World", default(double), new global::haxe.lang.DynamicObject(new int[]{302979532, 1547539107, 1648581351}, new object[]{"main", "Main", "source/Main.hx"}, new int[]{1981972957}, new double[]{((double) (3) )}));
}
}
}

View File

@ -0,0 +1,58 @@
// Generated by Haxe 4.0.5
#ifndef INCLUDED_Main
#define INCLUDED_Main
#ifndef HXCPP_H
#include <hxcpp.h>
#endif
HX_DECLARE_CLASS0(Main)
class HXCPP_CLASS_ATTRIBUTES Main_obj : public hx::Object
{
public:
typedef hx::Object super;
typedef Main_obj OBJ_;
Main_obj();
public:
enum { _hx_ClassId = 0x332f6459 };
void __construct();
inline void *operator new(size_t inSize, bool inContainer=false,const char *inName="Main")
{ return hx::Object::operator new(inSize,inContainer,inName); }
inline void *operator new(size_t inSize, int extra)
{ return hx::Object::operator new(inSize+extra,false,"Main"); }
inline static hx::ObjectPtr< Main_obj > __new() {
hx::ObjectPtr< Main_obj > __this = new Main_obj();
__this->__construct();
return __this;
}
inline static hx::ObjectPtr< Main_obj > __alloc(hx::Ctx *_hx_ctx) {
Main_obj *__this = (Main_obj*)(hx::Ctx::alloc(_hx_ctx, sizeof(Main_obj), false, "Main"));
*(void **)__this = Main_obj::_hx_vtable;
return __this;
}
static void * _hx_vtable;
static Dynamic __CreateEmpty();
static Dynamic __Create(hx::DynamicArray inArgs);
//~Main_obj();
HX_DO_RTTI_ALL;
static bool __GetStatic(const ::String &inString, Dynamic &outValue, hx::PropertyAccess inCallProp);
static void __register();
bool _hx_isInstanceOf(int inClassId);
::String __ToString() const { return HX_("Main",59,64,2f,33); }
static void main();
static ::Dynamic main_dyn();
};
#endif /* INCLUDED_Main */

View File

@ -0,0 +1,41 @@
// Generated by Haxe 4.0.5
package haxe.root;
import haxe.root.*;
@SuppressWarnings(value={"rawtypes", "unchecked"})
public class Main extends haxe.lang.HxObject
{
public static void main(String[] args)
{
haxe.java.Init.init();
{
haxe.root.Main.main();
}
}
public Main(haxe.lang.EmptyObject empty)
{
}
public Main()
{
haxe.root.Main.__hx_ctor__Main(this);
}
protected static void __hx_ctor__Main(haxe.root.Main __hx_this)
{
}
public static void main()
{
haxe.Log.trace.__hx_invoke2_o(0.0, "Hello World", 0.0, new haxe.lang.DynamicObject(new java.lang.String[]{"className", "fileName", "methodName"}, new java.lang.Object[]{"Main", "source/Main.hx", "main"}, new java.lang.String[]{"lineNumber"}, new double[]{((double) (((double) (3) )) )}));
}
}

View File

@ -0,0 +1,31 @@
<?php
/**
* Generated by Haxe 4.0.5
*/
use \php\_Boot\HxAnon;
use \php\Boot;
use \haxe\Log;
class Main {
/**
* @return void
*/
public static function main () {
#source/Main.hx:3: characters 3-8
(Log::$trace)("Hello World", new HxAnon([
"fileName" => "source/Main.hx",
"lineNumber" => 3,
"className" => "Main",
"methodName" => "main",
]));
}
/**
* @return void
*/
public function __construct () {
}
}
Boot::registerClass(Main::class, 'Main');

View File

@ -0,0 +1,8 @@
// Generated by Haxe 4.0.5
(function ($global) { "use strict";
var Main = function() { };
Main.main = function() {
console.log("source/Main.hx:3:","Hello World");
};
Main.main();
})({});

View File

@ -0,0 +1,816 @@
-- Generated by Haxe 4.0.5
local _hx_array_mt = {
__newindex = function(t,k,v)
local len = t.length
t.length = k >= len and (k + 1) or len
rawset(t,k,v)
end
}
local function _hx_tab_array(tab,length)
tab.length = length
return setmetatable(tab, _hx_array_mt)
end
local function _hx_anon_newindex(t,k,v) t.__fields__[k] = true; rawset(t,k,v); end
local _hx_anon_mt = {__newindex=_hx_anon_newindex}
local function _hx_a(...)
local __fields__ = {};
local ret = {__fields__ = __fields__};
local max = select('#',...);
local tab = {...};
local cur = 1;
while cur < max do
local v = tab[cur];
__fields__[v] = true;
ret[v] = tab[cur+1];
cur = cur + 2
end
return setmetatable(ret, _hx_anon_mt)
end
local function _hx_e()
return setmetatable({__fields__ = {}}, _hx_anon_mt)
end
local function _hx_o(obj)
return setmetatable(obj, _hx_anon_mt)
end
local function _hx_new(prototype)
return setmetatable({__fields__ = {}}, {__newindex=_hx_anon_newindex, __index=prototype})
end
local _hxClasses = {}
local Int = _hx_e();
local Dynamic = _hx_e();
local Float = _hx_e();
local Bool = _hx_e();
local Class = _hx_e();
local Enum = _hx_e();
local Array = _hx_e()
__lua_lib_luautf8_Utf8 = _G.require("lua-utf8")
local Main = _hx_e()
local Math = _hx_e()
local String = _hx_e()
local Std = _hx_e()
__haxe_Log = _hx_e()
__lua_Boot = _hx_e()
local _hx_bind, _hx_bit, _hx_staticToInstance, _hx_funcToField, _hx_maxn, _hx_print, _hx_apply_self, _hx_box_mr, _hx_bit_clamp, _hx_table, _hx_bit_raw
local _hx_pcall_default = {};
local _hx_pcall_break = {};
Array.new = function()
local self = _hx_new(Array.prototype)
Array.super(self)
return self
end
Array.super = function(self)
_hx_tab_array(self, 0);
end
Array.prototype = _hx_a();
Array.prototype.concat = function(self,a)
local _g = _hx_tab_array({}, 0);
local _g1 = 0;
local _g2 = self;
while (_g1 < _g2.length) do
local i = _g2[_g1];
_g1 = _g1 + 1;
_g:push(i);
end;
local ret = _g;
local _g3 = 0;
while (_g3 < a.length) do
local i1 = a[_g3];
_g3 = _g3 + 1;
ret:push(i1);
end;
do return ret end
end
Array.prototype.join = function(self,sep)
local tbl = ({});
local _gthis = self;
local cur_length = 0;
local i = _hx_o({__fields__={hasNext=true,next=true},hasNext=function(self)
do return cur_length < _gthis.length end;
end,next=function(self)
cur_length = cur_length + 1;
do return _gthis[cur_length - 1] end;
end});
while (i:hasNext()) do
local i1 = i:next();
_G.table.insert(tbl, Std.string(i1));
end;
do return _G.table.concat(tbl, sep) end
end
Array.prototype.pop = function(self)
if (self.length == 0) then
do return nil end;
end;
local ret = self[self.length - 1];
self[self.length - 1] = nil;
self.length = self.length - 1;
do return ret end
end
Array.prototype.push = function(self,x)
self[self.length] = x;
do return self.length end
end
Array.prototype.reverse = function(self)
local tmp;
local i = 0;
while (i < Std.int(self.length / 2)) do
tmp = self[i];
self[i] = self[(self.length - i) - 1];
self[(self.length - i) - 1] = tmp;
i = i + 1;
end;
end
Array.prototype.shift = function(self)
if (self.length == 0) then
do return nil end;
end;
local ret = self[0];
if (self.length == 1) then
self[0] = nil;
else
if (self.length > 1) then
self[0] = self[1];
_G.table.remove(self, 1);
end;
end;
local tmp = self;
tmp.length = tmp.length - 1;
do return ret end
end
Array.prototype.slice = function(self,pos,_end)
if ((_end == nil) or (_end > self.length)) then
_end = self.length;
else
if (_end < 0) then
_end = _G.math.fmod((self.length - (_G.math.fmod(-_end, self.length))), self.length);
end;
end;
if (pos < 0) then
pos = _G.math.fmod((self.length - (_G.math.fmod(-pos, self.length))), self.length);
end;
if ((pos > _end) or (pos > self.length)) then
do return _hx_tab_array({}, 0) end;
end;
local ret = _hx_tab_array({}, 0);
local _g = pos;
local _g1 = _end;
while (_g < _g1) do
_g = _g + 1;
local i = _g - 1;
ret:push(self[i]);
end;
do return ret end
end
Array.prototype.sort = function(self,f)
local i = 0;
local l = self.length;
while (i < l) do
local swap = false;
local j = 0;
local max = (l - i) - 1;
while (j < max) do
if (f(self[j], self[j + 1]) > 0) then
local tmp = self[j + 1];
self[j + 1] = self[j];
self[j] = tmp;
swap = true;
end;
j = j + 1;
end;
if (not swap) then
break;
end;
i = i + 1;
end;
end
Array.prototype.splice = function(self,pos,len)
if ((len < 0) or (pos > self.length)) then
do return _hx_tab_array({}, 0) end;
else
if (pos < 0) then
pos = self.length - (_G.math.fmod(-pos, self.length));
end;
end;
len = Math.min(len, self.length - pos);
local ret = _hx_tab_array({}, 0);
local _g = pos;
local _g1 = pos + len;
while (_g < _g1) do
_g = _g + 1;
local i = _g - 1;
ret:push(self[i]);
self[i] = self[i + len];
end;
local _g2 = pos + len;
local _g3 = self.length;
while (_g2 < _g3) do
_g2 = _g2 + 1;
local i1 = _g2 - 1;
self[i1] = self[i1 + len];
end;
local tmp = self;
tmp.length = tmp.length - len;
do return ret end
end
Array.prototype.toString = function(self)
local tbl = ({});
_G.table.insert(tbl, "[");
_G.table.insert(tbl, self:join(","));
_G.table.insert(tbl, "]");
do return _G.table.concat(tbl, "") end
end
Array.prototype.unshift = function(self,x)
local len = self.length;
local _g = 0;
local _g1 = len;
while (_g < _g1) do
_g = _g + 1;
local i = _g - 1;
self[len - i] = self[(len - i) - 1];
end;
self[0] = x;
end
Array.prototype.insert = function(self,pos,x)
if (pos > self.length) then
pos = self.length;
end;
if (pos < 0) then
pos = self.length + pos;
if (pos < 0) then
pos = 0;
end;
end;
local cur_len = self.length;
while (cur_len > pos) do
self[cur_len] = self[cur_len - 1];
cur_len = cur_len - 1;
end;
self[pos] = x;
end
Array.prototype.remove = function(self,x)
local _g = 0;
local _g1 = self.length;
while (_g < _g1) do
_g = _g + 1;
local i = _g - 1;
if (self[i] == x) then
local _g2 = i;
local _g11 = self.length - 1;
while (_g2 < _g11) do
_g2 = _g2 + 1;
local j = _g2 - 1;
self[j] = self[j + 1];
end;
self[self.length - 1] = nil;
self.length = self.length - 1;
do return true end;
end;
end;
do return false end
end
Array.prototype.indexOf = function(self,x,fromIndex)
local _end = self.length;
if (fromIndex == nil) then
fromIndex = 0;
else
if (fromIndex < 0) then
fromIndex = self.length + fromIndex;
if (fromIndex < 0) then
fromIndex = 0;
end;
end;
end;
local _g = fromIndex;
local _g1 = _end;
while (_g < _g1) do
_g = _g + 1;
local i = _g - 1;
if (x == self[i]) then
do return i end;
end;
end;
do return -1 end
end
Array.prototype.lastIndexOf = function(self,x,fromIndex)
if ((fromIndex == nil) or (fromIndex >= self.length)) then
fromIndex = self.length - 1;
else
if (fromIndex < 0) then
fromIndex = self.length + fromIndex;
if (fromIndex < 0) then
do return -1 end;
end;
end;
end;
local i = fromIndex;
while (i >= 0) do
if (self[i] == x) then
do return i end;
else
i = i - 1;
end;
end;
do return -1 end
end
Array.prototype.copy = function(self)
local _g = _hx_tab_array({}, 0);
local _g1 = 0;
local _g2 = self;
while (_g1 < _g2.length) do
local i = _g2[_g1];
_g1 = _g1 + 1;
_g:push(i);
end;
do return _g end
end
Array.prototype.map = function(self,f)
local _g = _hx_tab_array({}, 0);
local _g1 = 0;
local _g2 = self;
while (_g1 < _g2.length) do
local i = _g2[_g1];
_g1 = _g1 + 1;
_g:push(f(i));
end;
do return _g end
end
Array.prototype.filter = function(self,f)
local _g = _hx_tab_array({}, 0);
local _g1 = 0;
local _g2 = self;
while (_g1 < _g2.length) do
local i = _g2[_g1];
_g1 = _g1 + 1;
if (f(i)) then
_g:push(i);
end;
end;
do return _g end
end
Array.prototype.iterator = function(self)
local _gthis = self;
local cur_length = 0;
do return _hx_o({__fields__={hasNext=true,next=true},hasNext=function(self)
do return cur_length < _gthis.length end;
end,next=function(self)
cur_length = cur_length + 1;
do return _gthis[cur_length - 1] end;
end}) end
end
Array.prototype.resize = function(self,len)
if (self.length < len) then
self.length = len;
else
if (self.length > len) then
local _g = len;
local _g1 = self.length;
while (_g < _g1) do
_g = _g + 1;
local i = _g - 1;
self[i] = nil;
end;
self.length = len;
end;
end;
end
Main.new = {}
Main.main = function()
__haxe_Log.trace("Hello World", _hx_o({__fields__={fileName=true,lineNumber=true,className=true,methodName=true},fileName="source/Main.hx",lineNumber=3,className="Main",methodName="main"}));
end
Math.new = {}
Math.isNaN = function(f)
do return f ~= f end;
end
Math.isFinite = function(f)
if (f > -_G.math.huge) then
do return f < _G.math.huge end;
else
do return false end;
end;
end
Math.min = function(a,b)
if (Math.isNaN(a) or Math.isNaN(b)) then
do return (0/0) end;
else
do return _G.math.min(a, b) end;
end;
end
String.new = function(string)
local self = _hx_new(String.prototype)
String.super(self,string)
self = string
return self
end
String.super = function(self,string)
end
String.__index = function(s,k)
if (k == "length") then
do return __lua_lib_luautf8_Utf8.len(s) end;
else
local o = String.prototype;
local field = k;
if ((function()
local _hx_1
if ((_G.type(o) == "string") and ((String.prototype[field] ~= nil) or (field == "length"))) then
_hx_1 = true; elseif (o.__fields__ ~= nil) then
_hx_1 = o.__fields__[field] ~= nil; else
_hx_1 = o[field] ~= nil; end
return _hx_1
end )()) then
do return String.prototype[k] end;
else
if (String.__oldindex ~= nil) then
if (_G.type(String.__oldindex) == "function") then
do return String.__oldindex(s, k) end;
else
if (_G.type(String.__oldindex) == "table") then
do return String.__oldindex[k] end;
end;
end;
do return nil end;
else
do return nil end;
end;
end;
end;
end
String.fromCharCode = function(code)
do return __lua_lib_luautf8_Utf8.char(code) end;
end
String.prototype = _hx_a();
String.prototype.toUpperCase = function(self)
do return __lua_lib_luautf8_Utf8.upper(self) end
end
String.prototype.toLowerCase = function(self)
do return __lua_lib_luautf8_Utf8.lower(self) end
end
String.prototype.indexOf = function(self,str,startIndex)
if (startIndex == nil) then
startIndex = 1;
else
startIndex = startIndex + 1;
end;
local r = __lua_lib_luautf8_Utf8.find(self, str, startIndex, true);
if ((r ~= nil) and (r > 0)) then
do return r - 1 end;
else
do return -1 end;
end;
end
String.prototype.lastIndexOf = function(self,str,startIndex)
local i = 0;
local ret = -1;
if (startIndex == nil) then
startIndex = __lua_lib_luautf8_Utf8.len(self);
end;
while (true) do
local startIndex1 = ret + 1;
if (startIndex1 == nil) then
startIndex1 = 1;
else
startIndex1 = startIndex1 + 1;
end;
local r = __lua_lib_luautf8_Utf8.find(self, str, startIndex1, true);
local p = (function()
local _hx_1
if ((r ~= nil) and (r > 0)) then
_hx_1 = r - 1; else
_hx_1 = -1; end
return _hx_1
end )();
if ((p == -1) or (p > startIndex)) then
break;
end;
ret = p;
end;
do return ret end
end
String.prototype.split = function(self,delimiter)
local idx = 1;
local ret = _hx_tab_array({}, 0);
local delim_offset = (function()
local _hx_1
if (__lua_lib_luautf8_Utf8.len(delimiter) > 0) then
_hx_1 = __lua_lib_luautf8_Utf8.len(delimiter); else
_hx_1 = 1; end
return _hx_1
end )();
while (idx ~= nil) do
local newidx = 0;
if (__lua_lib_luautf8_Utf8.len(delimiter) > 0) then
newidx = __lua_lib_luautf8_Utf8.find(self, delimiter, idx, true);
else
if (idx >= __lua_lib_luautf8_Utf8.len(self)) then
newidx = nil;
else
newidx = idx + 1;
end;
end;
if (newidx ~= nil) then
local match = __lua_lib_luautf8_Utf8.sub(self, idx, newidx - 1);
ret:push(match);
idx = newidx + __lua_lib_luautf8_Utf8.len(delimiter);
else
ret:push(__lua_lib_luautf8_Utf8.sub(self, idx, __lua_lib_luautf8_Utf8.len(self)));
idx = nil;
end;
end;
do return ret end
end
String.prototype.toString = function(self)
do return self end
end
String.prototype.substring = function(self,startIndex,endIndex)
if (endIndex == nil) then
endIndex = __lua_lib_luautf8_Utf8.len(self);
end;
if (endIndex < 0) then
endIndex = 0;
end;
if (startIndex < 0) then
startIndex = 0;
end;
if (endIndex < startIndex) then
do return __lua_lib_luautf8_Utf8.sub(self, endIndex + 1, startIndex) end;
else
do return __lua_lib_luautf8_Utf8.sub(self, startIndex + 1, endIndex) end;
end;
end
String.prototype.charAt = function(self,index)
do return __lua_lib_luautf8_Utf8.sub(self, index + 1, index + 1) end
end
String.prototype.charCodeAt = function(self,index)
do return __lua_lib_luautf8_Utf8.byte(self, index + 1) end
end
String.prototype.substr = function(self,pos,len)
if ((len == nil) or (len > (pos + __lua_lib_luautf8_Utf8.len(self)))) then
len = __lua_lib_luautf8_Utf8.len(self);
else
if (len < 0) then
len = __lua_lib_luautf8_Utf8.len(self) + len;
end;
end;
if (pos < 0) then
pos = __lua_lib_luautf8_Utf8.len(self) + pos;
end;
if (pos < 0) then
pos = 0;
end;
do return __lua_lib_luautf8_Utf8.sub(self, pos + 1, pos + len) end
end
Std.new = {}
Std.string = function(s)
do return __lua_Boot.__string_rec(s) end;
end
Std.int = function(x)
if (not Math.isFinite(x) or Math.isNaN(x)) then
do return 0 end;
else
do return _hx_bit_clamp(x) end;
end;
end
__haxe_Log.new = {}
__haxe_Log.formatOutput = function(v,infos)
local str = Std.string(v);
if (infos == nil) then
do return str end;
end;
local pstr = Std.string(Std.string(infos.fileName) .. Std.string(":")) .. Std.string(infos.lineNumber);
if (infos.customParams ~= nil) then
local _g = 0;
local _g1 = infos.customParams;
while (_g < _g1.length) do
local v1 = _g1[_g];
_g = _g + 1;
str = Std.string(str) .. Std.string((Std.string(", ") .. Std.string(Std.string(v1))));
end;
end;
do return Std.string(Std.string(pstr) .. Std.string(": ")) .. Std.string(str) end;
end
__haxe_Log.trace = function(v,infos)
local str = __haxe_Log.formatOutput(v, infos);
_hx_print(str);
end
__lua_Boot.new = {}
__lua_Boot.isArray = function(o)
if (_G.type(o) == "table") then
if ((o.__enum__ == nil) and (_G.getmetatable(o) ~= nil)) then
do return _G.getmetatable(o).__index == Array.prototype end;
else
do return false end;
end;
else
do return false end;
end;
end
__lua_Boot.printEnum = function(o,s)
if (o.length == 2) then
do return o[0] end;
else
local str = Std.string(Std.string(o[0])) .. Std.string("(");
s = Std.string(s) .. Std.string("\t");
local _g = 2;
local _g1 = o.length;
while (_g < _g1) do
_g = _g + 1;
local i = _g - 1;
if (i ~= 2) then
str = Std.string(str) .. Std.string((Std.string(",") .. Std.string(__lua_Boot.__string_rec(o[i], s))));
else
str = Std.string(str) .. Std.string(__lua_Boot.__string_rec(o[i], s));
end;
end;
do return Std.string(str) .. Std.string(")") end;
end;
end
__lua_Boot.printClassRec = function(c,result,s)
if (result == nil) then
result = "";
end;
local f = __lua_Boot.__string_rec;
for k,v in pairs(c) do if result ~= '' then result = result .. ', ' end result = result .. k .. ':' .. f(v, s.. ' ') end;
do return result end;
end
__lua_Boot.__string_rec = function(o,s)
if (s == nil) then
s = "";
end;
if (__lua_lib_luautf8_Utf8.len(s) >= 5) then
do return "<...>" end;
end;
local _g = type(o);
if (_g) == "boolean" then
do return tostring(o) end;
elseif (_g) == "function" then
do return "<function>" end;
elseif (_g) == "nil" then
do return "null" end;
elseif (_g) == "number" then
if (o == _G.math.huge) then
do return "Infinity" end;
else
if (o == -_G.math.huge) then
do return "-Infinity" end;
else
if (o == 0) then
do return "0" end;
else
if (o ~= o) then
do return "NaN" end;
else
do return tostring(o) end;
end;
end;
end;
end;
elseif (_g) == "string" then
do return o end;
elseif (_g) == "table" then
if (o.__enum__ ~= nil) then
do return __lua_Boot.printEnum(o, s) end;
else
if ((_hx_wrap_if_string_field(o,'toString') ~= nil) and not __lua_Boot.isArray(o)) then
do return _hx_wrap_if_string_field(o,'toString')(o) end;
else
if (__lua_Boot.isArray(o)) then
local o2 = o;
if (__lua_lib_luautf8_Utf8.len(s) > 5) then
do return "[...]" end;
else
local _g1 = _hx_tab_array({}, 0);
local _g11 = 0;
while (_g11 < o2.length) do
local i = o2[_g11];
_g11 = _g11 + 1;
_g1:push(__lua_Boot.__string_rec(i, Std.string(s) .. Std.string(1)));
end;
do return Std.string(Std.string("[") .. Std.string(_g1:join(","))) .. Std.string("]") end;
end;
else
if (o.__class__ ~= nil) then
do return Std.string(Std.string("{") .. Std.string(__lua_Boot.printClassRec(o, "", Std.string(s) .. Std.string("\t")))) .. Std.string("}") end;
else
local fields = __lua_Boot.fieldIterator(o);
local buffer = ({});
local first = true;
_G.table.insert(buffer, "{ ");
local f = fields;
while (f:hasNext()) do
local f1 = f:next();
if (first) then
first = false;
else
_G.table.insert(buffer, ", ");
end;
_G.table.insert(buffer, Std.string(Std.string(Std.string("") .. Std.string(Std.string(f1))) .. Std.string(" : ")) .. Std.string(__lua_Boot.__string_rec(o[f1], Std.string(s) .. Std.string("\t"))));
end;
_G.table.insert(buffer, " }");
do return _G.table.concat(buffer, "") end;
end;
end;
end;
end;
elseif (_g) == "thread" then
do return "<thread>" end;
elseif (_g) == "userdata" then
local mt = _G.getmetatable(o);
if ((mt ~= nil) and (mt.__tostring ~= nil)) then
do return _G.tostring(o) end;
else
do return "<userdata>" end;
end;else
_G.error("Unknown Lua type",0); end;
end
__lua_Boot.fieldIterator = function(o)
if (_G.type(o) ~= "table") then
do return _hx_o({__fields__={next=true,hasNext=true},next=function(self)
do return nil end;
end,hasNext=function(self)
do return false end;
end}) end;
end;
local tbl = (function()
local _hx_1
if (o.__fields__ ~= nil) then
_hx_1 = o.__fields__; else
_hx_1 = o; end
return _hx_1
end )();
local cur = _G.pairs(tbl);
local next_valid = function(tbl1,val)
while (__lua_Boot.hiddenFields[val] ~= nil) do
val = cur(tbl1, val);
end;
do return val end;
end;
local cur_val = next_valid(tbl, cur(tbl, nil));
do return _hx_o({__fields__={next=true,hasNext=true},next=function(self)
local ret = cur_val;
cur_val = next_valid(tbl, cur(tbl, cur_val));
do return ret end;
end,hasNext=function(self)
do return cur_val ~= nil end;
end}) end;
end
_hx_bit_clamp = function(v)
if v <= 2147483647 and v >= -2147483648 then
if v > 0 then return _G.math.floor(v)
else return _G.math.ceil(v)
end
end
if v > 2251798999999999 then v = v*2 end;
if (v ~= v or math.abs(v) == _G.math.huge) then return nil end
return _hx_bit.band(v, 2147483647 ) - math.abs(_hx_bit.band(v, 2147483648))
end
-- require this for lua 5.1
pcall(require, 'bit')
if bit then
_hx_bit = bit
else
local _hx_bit_raw = _G.require('bit32')
_hx_bit = setmetatable({}, { __index = _hx_bit_raw });
-- lua 5.2 weirdness
_hx_bit.bnot = function(...) return _hx_bit_clamp(_hx_bit_raw.bnot(...)) end;
_hx_bit.bxor = function(...) return _hx_bit_clamp(_hx_bit_raw.bxor(...)) end;
end
_hx_array_mt.__index = Array.prototype
local _hx_static_init = function()
__lua_Boot.hiddenFields = {__id__=true, hx__closures=true, super=true, prototype=true, __fields__=true, __ifields__=true, __class__=true, __properties__=true}
end
_hx_print = print or (function() end)
_hx_wrap_if_string_field = function(o, fld)
if _G.type(o) == 'string' then
if fld == 'length' then
return _G.string.len(o)
else
return String.prototype[fld]
end
else
return o[fld]
end
end
_hx_static_init();
Main.main()

View File

@ -0,0 +1,28 @@
# Generated by Haxe 4.0.5
# coding: utf-8
import sys
class Main:
__slots__ = ()
@staticmethod
def main():
print("Hello World")
class python_internal_MethodClosure:
__slots__ = ("obj", "func")
def __init__(self,obj,func):
self.obj = obj
self.func = func
def __call__(self,*args):
return self.func(self.obj,*args)
Main.main()

View File

@ -0,0 +1,2 @@
.accordion{padding:0;margin:0;position:relative;list-style:none}.accordion>*{position:absolute;overflow:hidden;padding:0;margin:0}.accordion .accordion,.accordion.edge-visible,.accordion>*{-webkit-transition:.3s ease all;-moz-transition:.3s ease all;-o-transition:.3s ease all;transition:.3s ease all}.accordion,.accordion>*{will-change:height,transform;-webkit-perspective:90em;-moz-perspective:90em;perspective:90em;-webkit-backface-visibility:hidden;-moz-backface-visibility:hidden;backface-visibility:hidden;-webkit-transform:translate3d(0,0,0);-moz-transform:translate3d(0,0,0);-ms-transform:translateY(0);-o-transform:translate3d(0,0,0);transform:translate3d(0,0,0)}.snap.accordion .accordion,.snap.accordion>*{-webkit-transition:none!important;-moz-transition:none!important;-o-transition:none!important;transition:none!important}.accordion>*>:first-child{cursor:pointer;margin:0;-webkit-user-select:none;-moz-user-select:none;-ms-user-select:none;user-select:none}.accordion>*>:last-child{overflow:hidden;-webkit-transition:.3s ease height,.3s step-start visibility;-moz-transition:.3s ease height,.3s step-start visibility;-o-transition:.3s ease height,.3s step-start visibility;transition:.3s ease height,.3s step-start visibility}.accordion>.closed .accordion>.open>:last-child,.accordion>.closed>:last-child{-webkit-transition-timing-function:ease,step-end;-moz-transition-timing-function:ease,step-end;-o-transition-timing-function:ease,step-end;transition-timing-function:ease,step-end;visibility:hidden}
/*# sourceMappingURL=data:application/json;charset=utf-8;base64,eyJ2ZXJzaW9uIjozLCJzb3VyY2VzIjpbImFjY29yZGlvbi5jc3MiXSwibmFtZXMiOltdLCJtYXBwaW5ncyI6IkFBQ0EsV0FDQyxRQUFTLEVBQ1QsT0FBUyxFQUNULFNBQVUsU0FDVixXQUFZLEtBRVosYUFDQyxTQUFVLFNBQ1YsU0FBVSxPQUNWLFFBQVMsRUFDVCxPQUFTLEVBSVQsc0JBREEsd0JBREEsYUFHQyxtQkFBb0IsSUFBSSxLQUFLLElBQzdCLGdCQUFvQixJQUFJLEtBQUssSUFDN0IsY0FBb0IsSUFBSSxLQUFLLElBQzdCLFdBQW9CLElBQUksS0FBSyxJQUk5QixXQUNBLGFBQ0MsWUFBYSxNQUFNLENBQUUsVUFDckIsb0JBQXFCLEtBQ3JCLGlCQUFxQixLQUNyQixZQUFxQixLQUVyQiw0QkFBNkIsT0FDN0IseUJBQTZCLE9BQzdCLG9CQUE2QixPQUU3QixrQkFBb0IsbUJBQ3BCLGVBQW9CLG1CQUNwQixjQUFvQixjQUNwQixhQUFvQixtQkFDcEIsVUFBb0IsbUJBS3JCLDJCQURBLGtCQUVDLG1CQUFvQixlQUNwQixnQkFBb0IsZUFDcEIsY0FBb0IsZUFDcEIsV0FBb0IsZUFJckIsMEJBQ0MsT0FBUSxRQUNSLE9BQVEsRUFFUixvQkFBcUIsS0FDckIsaUJBQXFCLEtBQ3JCLGdCQUFxQixLQUNyQixZQUFxQixLQUl0Qix5QkFDQyxTQUFVLE9BQ1YsbUJBQW9CLElBQUksS0FBSyxNQUFNLENBQUUsSUFBSSxXQUFXLFdBQ3BELGdCQUFvQixJQUFJLEtBQUssTUFBTSxDQUFFLElBQUksV0FBVyxXQUNwRCxjQUFvQixJQUFJLEtBQUssTUFBTSxDQUFFLElBQUksV0FBVyxXQUNwRCxXQUFvQixJQUFJLEtBQUssTUFBTSxDQUFFLElBQUksV0FBVyxXQUdwRCxnREFEQSwrQkFFQyxtQ0FBb0MsSUFBSSxDQUFFLFNBQzFDLGdDQUFvQyxJQUFJLENBQUUsU0FDMUMsOEJBQW9DLElBQUksQ0FBRSxTQUMxQywyQkFBb0MsSUFBSSxDQUFFLFNBQzFDLFdBQVkiLCJzb3VyY2VzQ29udGVudCI6WyIvKiogQ29yZSBzdHlsaW5nICovXG4uYWNjb3JkaW9ue1xuXHRwYWRkaW5nOiAwO1xuXHRtYXJnaW46ICAwO1xuXHRwb3NpdGlvbjogcmVsYXRpdmU7XG5cdGxpc3Qtc3R5bGU6IG5vbmU7XG59XG5cdC5hY2NvcmRpb24gPiAqe1xuXHRcdHBvc2l0aW9uOiBhYnNvbHV0ZTtcblx0XHRvdmVyZmxvdzogaGlkZGVuO1xuXHRcdHBhZGRpbmc6IDA7XG5cdFx0bWFyZ2luOiAgMDtcblx0fVxuXHRcdC5hY2NvcmRpb24gPiAqLFxuXHRcdC5hY2NvcmRpb24uZWRnZS12aXNpYmxlLFxuXHRcdC5hY2NvcmRpb24gLmFjY29yZGlvbntcblx0XHRcdC13ZWJraXQtdHJhbnNpdGlvbjogLjNzIGVhc2UgYWxsO1xuXHRcdFx0LW1vei10cmFuc2l0aW9uOiAgICAuM3MgZWFzZSBhbGw7XG5cdFx0XHQtby10cmFuc2l0aW9uOiAgICAgIC4zcyBlYXNlIGFsbDtcblx0XHRcdHRyYW5zaXRpb246ICAgICAgICAgLjNzIGVhc2UgYWxsO1xuXHRcdH1cblx0XHRcblx0XHQvKiogVHJhbnNmb3JtLXJlbGF0ZWQgKi9cblx0XHQuYWNjb3JkaW9uLFxuXHRcdC5hY2NvcmRpb24gPiAqe1xuXHRcdFx0d2lsbC1jaGFuZ2U6IGhlaWdodCwgdHJhbnNmb3JtO1xuXHRcdFx0LXdlYmtpdC1wZXJzcGVjdGl2ZTogOTBlbTtcblx0XHRcdC1tb3otcGVyc3BlY3RpdmU6ICAgIDkwZW07XG5cdFx0XHRwZXJzcGVjdGl2ZTogICAgICAgICA5MGVtO1xuXHRcdFx0XG5cdFx0XHQtd2Via2l0LWJhY2tmYWNlLXZpc2liaWxpdHk6IGhpZGRlbjtcblx0XHRcdC1tb3otYmFja2ZhY2UtdmlzaWJpbGl0eTogICAgaGlkZGVuO1xuXHRcdFx0YmFja2ZhY2UtdmlzaWJpbGl0eTogICAgICAgICBoaWRkZW47XG5cdFx0XHRcblx0XHRcdC13ZWJraXQtdHJhbnNmb3JtOiAgdHJhbnNsYXRlM2QoMCwwLDApO1xuXHRcdFx0LW1vei10cmFuc2Zvcm06ICAgICB0cmFuc2xhdGUzZCgwLDAsMCk7XG5cdFx0XHQtbXMtdHJhbnNmb3JtOiAgICAgIHRyYW5zbGF0ZVkoMCk7XG5cdFx0XHQtby10cmFuc2Zvcm06ICAgICAgIHRyYW5zbGF0ZTNkKDAsMCwwKTtcblx0XHRcdHRyYW5zZm9ybTogICAgICAgICAgdHJhbnNsYXRlM2QoMCwwLDApO1xuXHRcdH1cblx0XHRcblx0XHQvKiogUnVsZSB0byBkaXNhYmxlIHRyYW5zaXRpb25zIGJldHdlZW4gZ2FwIGNvcnJlY3Rpb25zICovXG5cdFx0LnNuYXAuYWNjb3JkaW9uID4gKixcblx0XHQuc25hcC5hY2NvcmRpb24gLmFjY29yZGlvbntcblx0XHRcdC13ZWJraXQtdHJhbnNpdGlvbjogbm9uZSAhaW1wb3J0YW50O1xuXHRcdFx0LW1vei10cmFuc2l0aW9uOiAgICBub25lICFpbXBvcnRhbnQ7XG5cdFx0XHQtby10cmFuc2l0aW9uOiAgICAgIG5vbmUgIWltcG9ydGFudDtcblx0XHRcdHRyYW5zaXRpb246ICAgICAgICAgbm9uZSAhaW1wb3J0YW50O1xuXHRcdH1cblxuXHRcdC8qKiBIZWFkaW5ncyAqL1xuXHRcdC5hY2NvcmRpb24gPiAqID4gOmZpcnN0LWNoaWxke1xuXHRcdFx0Y3Vyc29yOiBwb2ludGVyO1xuXHRcdFx0bWFyZ2luOiAwO1xuXHRcdFx0XG5cdFx0XHQtd2Via2l0LXVzZXItc2VsZWN0OiBub25lO1xuXHRcdFx0LW1vei11c2VyLXNlbGVjdDogICAgbm9uZTtcblx0XHRcdC1tcy11c2VyLXNlbGVjdDogICAgIG5vbmU7XG5cdFx0XHR1c2VyLXNlbGVjdDogICAgICAgICBub25lO1xuXHRcdH1cblx0XHRcblx0XHQvKiogQ29sbGFwc2libGUgY29udGVudCAqL1xuXHRcdC5hY2NvcmRpb24gPiAqID4gOmxhc3QtY2hpbGR7XG5cdFx0XHRvdmVyZmxvdzogaGlkZGV
uO1xuXHRcdFx0LXdlYmtpdC10cmFuc2l0aW9uOiAuM3MgZWFzZSBoZWlnaHQsIC4zcyBzdGVwLXN0YXJ0IHZpc2liaWxpdHk7XG5cdFx0XHQtbW96LXRyYW5zaXRpb246ICAgIC4zcyBlYXNlIGhlaWdodCwgLjNzIHN0ZXAtc3RhcnQgdmlzaWJpbGl0eTtcblx0XHRcdC1vLXRyYW5zaXRpb246ICAgICAgLjNzIGVhc2UgaGVpZ2h0LCAuM3Mgc3RlcC1zdGFydCB2aXNpYmlsaXR5O1xuXHRcdFx0dHJhbnNpdGlvbjogICAgICAgICAuM3MgZWFzZSBoZWlnaHQsIC4zcyBzdGVwLXN0YXJ0IHZpc2liaWxpdHk7XG5cdFx0fVxuXHRcdFx0LmFjY29yZGlvbiA+IC5jbG9zZWQgPiA6bGFzdC1jaGlsZCxcblx0XHRcdC5hY2NvcmRpb24gPiAuY2xvc2VkIC5hY2NvcmRpb24gPiAub3BlbiA+IDpsYXN0LWNoaWxke1xuXHRcdFx0XHQtd2Via2l0LXRyYW5zaXRpb24tdGltaW5nLWZ1bmN0aW9uOiBlYXNlLCBzdGVwLWVuZDtcblx0XHRcdFx0LW1vei10cmFuc2l0aW9uLXRpbWluZy1mdW5jdGlvbjogICAgZWFzZSwgc3RlcC1lbmQ7XG5cdFx0XHRcdC1vLXRyYW5zaXRpb24tdGltaW5nLWZ1bmN0aW9uOiAgICAgIGVhc2UsIHN0ZXAtZW5kO1xuXHRcdFx0XHR0cmFuc2l0aW9uLXRpbWluZy1mdW5jdGlvbjogICAgICAgICBlYXNlLCBzdGVwLWVuZDtcblx0XHRcdFx0dmlzaWJpbGl0eTogaGlkZGVuO1xuXHRcdFx0fVxuIl19 */

View File

@ -0,0 +1,13 @@
"use strict";
Object.defineProperty(exports, "__esModule", { value: true });
class Foo {
constructor(){
this.name = "Foo";
}
}
exports.Foo = Foo;
//# sourceMappingURL=data:application/json;charset=utf-8;base64,eyJ2ZXJzaW9uIjozLCJmaWxlIjoiZXNtLmpzIiwic291cmNlcyI6WyJlc20ubWpzIl0sInNvdXJjZXNDb250ZW50IjpbImV4cG9ydCBjbGFzcyBGb28ge1xuXHRjb25zdHJ1Y3Rvcigpe1xuXHRcdHRoaXMubmFtZSA9IFwiRm9vXCI7XG5cdH1cbn07XG4iXSwibmFtZXMiOltdLCJtYXBwaW5ncyI6Ijs7OztBQUFPLE1BQU0sR0FBRyxDQUFDO0NBQ2hCLFdBQVcsRUFBRTtFQUNaLElBQUksQ0FBQyxJQUFJLEdBQUcsS0FBSyxDQUFDO0VBQ2xCO0NBQ0Q7Ozs7In0=

View File

@ -0,0 +1,2 @@
.accordion{padding:0;margin:0;position:relative;list-style:none}.accordion>*{position:absolute;overflow:hidden;padding:0;margin:0}.accordion .accordion,.accordion.edge-visible,.accordion>*{-webkit-transition:.3s ease all;-moz-transition:.3s ease all;-o-transition:.3s ease all;transition:.3s ease all}.accordion,.accordion>*{will-change:height,transform;-webkit-perspective:90em;-moz-perspective:90em;perspective:90em;-webkit-backface-visibility:hidden;-moz-backface-visibility:hidden;backface-visibility:hidden;-webkit-transform:translate3d(0,0,0);-moz-transform:translate3d(0,0,0);-ms-transform:translateY(0);-o-transform:translate3d(0,0,0);transform:translate3d(0,0,0)}.snap.accordion .accordion,.snap.accordion>*{-webkit-transition:none!important;-moz-transition:none!important;-o-transition:none!important;transition:none!important}.accordion>*>:first-child{cursor:pointer;margin:0;-webkit-user-select:none;-moz-user-select:none;-ms-user-select:none;user-select:none}.accordion>*>:last-child{overflow:hidden;-webkit-transition:.3s ease height,.3s step-start visibility;-moz-transition:.3s ease height,.3s step-start visibility;-o-transition:.3s ease height,.3s step-start visibility;transition:.3s ease height,.3s step-start visibility}.accordion>.closed .accordion>.open>:last-child,.accordion>.closed>:last-child{-webkit-transition-timing-function:ease,step-end;-moz-transition-timing-function:ease,step-end;-o-transition-timing-function:ease,step-end;transition-timing-function:ease,step-end;visibility:hidden}
/*# sourceMappingURL=linked.css.map */

View File

@ -0,0 +1 @@
{"version":3,"sources":["accordion.css"],"names":[],"mappings":"AACA,WACC,QAAS,EACT,OAAS,EACT,SAAU,SACV,WAAY,KAEZ,aACC,SAAU,SACV,SAAU,OACV,QAAS,EACT,OAAS,EAIT,sBADA,wBADA,aAGC,mBAAoB,IAAI,KAAK,IAC7B,gBAAoB,IAAI,KAAK,IAC7B,cAAoB,IAAI,KAAK,IAC7B,WAAoB,IAAI,KAAK,IAI9B,WACA,aACC,YAAa,MAAM,CAAE,UACrB,oBAAqB,KACrB,iBAAqB,KACrB,YAAqB,KAErB,4BAA6B,OAC7B,yBAA6B,OAC7B,oBAA6B,OAE7B,kBAAoB,mBACpB,eAAoB,mBACpB,cAAoB,cACpB,aAAoB,mBACpB,UAAoB,mBAKrB,2BADA,kBAEC,mBAAoB,eACpB,gBAAoB,eACpB,cAAoB,eACpB,WAAoB,eAIrB,0BACC,OAAQ,QACR,OAAQ,EAER,oBAAqB,KACrB,iBAAqB,KACrB,gBAAqB,KACrB,YAAqB,KAItB,yBACC,SAAU,OACV,mBAAoB,IAAI,KAAK,MAAM,CAAE,IAAI,WAAW,WACpD,gBAAoB,IAAI,KAAK,MAAM,CAAE,IAAI,WAAW,WACpD,cAAoB,IAAI,KAAK,MAAM,CAAE,IAAI,WAAW,WACpD,WAAoB,IAAI,KAAK,MAAM,CAAE,IAAI,WAAW,WAGpD,gDADA,+BAEC,mCAAoC,IAAI,CAAE,SAC1C,gCAAoC,IAAI,CAAE,SAC1C,8BAAoC,IAAI,CAAE,SAC1C,2BAAoC,IAAI,CAAE,SAC1C,WAAY"}

View File

@@ -0,0 +1,7 @@
1.0.0←ed6a955d-5826-4f98-a450-10b414266c27←ed6a955d-5826-4f98-a450-10b414266c27|{
"option_gameguid": "e429c80a-2ae4-494a-bee0-f7ed5e39ec9b"
}←1225f6b0-ac20-43bd-a82e-be73fa0b6f4f|{
"targets": 461609314234257646
}←7b2c4976-1e09-44e5-8256-c527145e03bb|{
"targets": 461609314234257646
}

View File

@@ -0,0 +1,30 @@
#if 0
<<'SKIP';
#endif
/*
----------------------------------------------------------------------
ppport.h -- Perl/Pollution/Portability Version 3.36
Automatically created by Devel::PPPort running under perl 5.026002.
Do NOT edit this file directly! -- Edit PPPort_pm.PL and the
includes in parts/inc/ instead.
Use 'perldoc ppport.h' to view the documentation below.
----------------------------------------------------------------------
SKIP
=pod
=head1 NAME
ppport.h - Perl/Pollution/Portability version 3.36
=head1 SYNOPSIS
perl ppport.h [options] [source files]
<< TRUNCATED >>

View File

@@ -0,0 +1,8 @@
<!DOCTYPE html>
<html>
<head>
<meta
content = "Org mode"
name = "generator" >
</head>
</html>

View File

@@ -0,0 +1,12 @@
<!DOCTYPE html>
<html>
<head>
<meta
data-foo="Bar"
http-equiv="Content-Type: text/html; charset=UTF-8"
content = "Org mode"
id="Some wicked id"
wrong-content="whoops"
name = "generator" >
</head>
</html>

View File

@@ -0,0 +1,6 @@
<!DOCTYPE html>
<html>
<head>
<meta name = "generator" content = "Org mode" />
</head>
</html>

View File

@@ -0,0 +1,7 @@
<!DOCTYPE html>
<html>
<meta >
<meta name="generator" content="something invalid">
<meta name="generator" content="makeinfo 4.8"
/>
</html>

View File

@@ -0,0 +1,21 @@
<!-- Creator : groff version 1.22.4 -->
<!-- CreationDate: Tue Jul 2 20:06:41 2019 -->
<!DOCTYPE html PUBLIC "-//W3C//DTD HTML 4.01 Transitional//EN"
"http://www.w3.org/TR/html4/loose.dtd">
<html>
<head>
<meta name="generator" content="groff -Thtml, see www.gnu.org">
<meta http-equiv="Content-Type" content="text/html; charset=US-ASCII">
<meta name="Content-Style" content="text/css">
<style type="text/css">
p { margin-top: 0; margin-bottom: 0; vertical-align: top }
pre { margin-top: 0; margin-bottom: 0; vertical-align: top }
table { margin-top: 0; margin-bottom: 0; vertical-align: top }
h1 { text-align: center }
</style>
<title></title>
</head>
<body>
... document truncated
</body>
</html>

View File

@@ -0,0 +1,25 @@
<?xml version="1.0" encoding="us-ascii"?>
<!DOCTYPE html PUBLIC "-//W3C//DTD XHTML 1.1 plus MathML 2.0//EN"
"http://www.w3.org/TR/MathML2/dtd/xhtml-math11-f.dtd"
[<!ENTITY mathml "http://www.w3.org/1998/Math/MathML">]>
<html xmlns="http://www.w3.org/1999/xhtml" xml:lang="en">
<head>
<meta name="generator" content="groff -Txhtml, see www.gnu.org"/>
<meta http-equiv="Content-Type" content="text/html; charset=US-ASCII"/>
<meta name="Content-Style" content="text/css"/>
<style type="text/css">
.center { text-align: center }
.right { text-align: right }
p { margin-top: 0; margin-bottom: 0; vertical-align: top }
pre { margin-top: 0; margin-bottom: 0; vertical-align: top }
table { margin-top: 0; margin-bottom: 0; vertical-align: top }
h1 { text-align: center }
</style>
<!-- Creator : groff version 1.22.4 -->
<!-- CreationDate: Tue Jul 2 20:08:09 2019 -->
<title></title>
</head>
<body>
... truncated
</body>
</html>

View File

@@ -0,0 +1,47 @@
<html lang="en">
<head>
<title>Asm Mode - GNU Emacs Manual</title>
<meta http-equiv="Content-Type" content="text/html; charset=UTF-8">
<meta name="description" content="GNU Emacs Manual">
<meta name="generator" content="makeinfo 4.8">
<link title="Top" rel="start" href="index.html#Top">
<link rel="up" href="Programs.html#Programs" title="Programs">
<link rel="prev" href="C-Modes.html#C-Modes" title="C Modes">
<link rel="next" href="Fortran.html#Fortran" title="Fortran">
<link href="http://www.gnu.org/software/texinfo/" rel="generator-home" title="Texinfo Homepage">
<!--
This is the `GNU Emacs Manual',
updated for Emacs version {No value for `EMACSVER'}.
Copyright (C) 1985--1987, 1993--2019 Free Software Foundation, Inc.
Permission is granted to copy, distribute and/or modify this
document under the terms of the GNU Free Documentation License,
Version 1.3 or any later version published by the Free Software
Foundation; with the Invariant Sections being ``The GNU
Manifesto,'' ``Distribution'' and ``GNU GENERAL PUBLIC LICENSE,''
with the Front-Cover Texts being ``A GNU Manual,'' and with the
Back-Cover Texts as in (a) below. A copy of the license is
included in the section entitled ``GNU Free Documentation
License.''
(a) The FSF's Back-Cover Text is: ``You have the freedom to copy
and modify this GNU manual. Buying copies from the FSF supports
it in developing GNU and promoting software freedom.''
-->
<meta http-equiv="Content-Style-Type" content="text/css">
<style type="text/css"><!--
pre.display { font-family:inherit }
pre.format { font-family:inherit }
pre.smalldisplay { font-family:inherit; font-size:smaller }
pre.smallformat { font-family:inherit; font-size:smaller }
pre.smallexample { font-size:smaller }
pre.smalllisp { font-size:smaller }
span.sc { font-variant:small-caps }
span.roman { font-family:serif; font-weight:normal; }
span.sansserif { font-family:sans-serif; font-weight:normal; }
--></style>
</head>
<body>
</body>
</html>

View File

@@ -0,0 +1,40 @@
<!DOCTYPE html>
<html>
<!-- This is an automatically generated file. Do not edit.
$OpenBSD: mandoc.1,v 1.161 2019/02/23 18:52:45 schwarze Exp $
Copyright (c) 2009, 2010, 2011 Kristaps Dzonsons <kristaps@bsd.lv>
Copyright (c) 2012, 2014-2018 Ingo Schwarze <schwarze@openbsd.org>
Permission to use, copy, modify, and distribute this software for any
purpose with or without fee is hereby granted, provided that the above
copyright notice and this permission notice appear in all copies.
THE SOFTWARE IS PROVIDED "AS IS" AND THE AUTHOR DISCLAIMS ALL WARRANTIES
WITH REGARD TO THIS SOFTWARE INCLUDING ALL IMPLIED WARRANTIES OF
MERCHANTABILITY AND FITNESS. IN NO EVENT SHALL THE AUTHOR BE LIABLE FOR
ANY SPECIAL, DIRECT, INDIRECT, OR CONSEQUENTIAL DAMAGES OR ANY DAMAGES
WHATSOEVER RESULTING FROM LOSS OF USE, DATA OR PROFITS, WHETHER IN AN
ACTION OF CONTRACT, NEGLIGENCE OR OTHER TORTIOUS ACTION, ARISING OUT OF
OR IN CONNECTION WITH THE USE OR PERFORMANCE OF THIS SOFTWARE.
-->
<head>
<meta charset="utf-8"/>
<style>
table.head, table.foot { width: 100%; }
td.head-rtitle, td.foot-os { text-align: right; }
td.head-vol { text-align: center; }
div.Pp { margin: 1ex 0ex; }
div.Nd, div.Bf, div.Op { display: inline; }
span.Pa, span.Ad { font-style: italic; }
span.Ms { font-weight: bold; }
dl.Bl-diag > dt { font-weight: bold; }
code.Nm, code.Fl, code.Cm, code.Ic, code.In, code.Fd, code.Fn,
code.Cd { font-weight: bold; font-family: inherit; }
</style>
<title>MANDOC(1)</title>
</head>
<body>
... document truncated
</body>
</html>

View File

@@ -0,0 +1,4 @@
<!DOCTYPE html>
<html>
<meta name="generator" value="Some sick tool nobody's using yet" />
</html>

View File

@@ -0,0 +1,73 @@
<!DOCTYPE HTML PUBLIC "-//W3C//DTD HTML 3.2 Final//EN">
<!--Converted with jLaTeX2HTML 2002-2-1 (1.70) JA patch-2.0
patched version by: Kenshi Muto, Debian Project.
* modified by: Shige TAKENO
LaTeX2HTML 2002-2-1 (1.70),
original version by: Nikos Drakos, CBLU, University of Leeds
* revised and updated by: Marcus Hennecke, Ross Moore, Herb Swan
* with significant contributions from:
Jens Lippmann, Marek Rouchal, Martin Wilck and others -->
<HTML>
<HEAD>
<TITLE>Quality/Complexity metric plugins</TITLE>
<META NAME="description" CONTENT="Quality/Complexity metric plugins">
<META NAME="keywords" CONTENT="main">
<META NAME="resource-type" CONTENT="document">
<META NAME="distribution" CONTENT="global">
<META NAME="Generator" CONTENT="jLaTeX2HTML v2002-2-1 JA patch-2.0">
<META HTTP-EQUIV="Content-Style-Type" CONTENT="text/css">
<LINK REL="STYLESHEET" HREF="main.css">
<LINK REL="next" HREF="node79.html">
<LINK REL="previous" HREF="node77.html">
<LINK REL="up" HREF="node74.html">
<LINK REL="next" HREF="node79.html">
</HEAD>
<BODY >
<!--Navigation Panel-->
<A NAME="tex2html1237"
HREF="node79.html">
<IMG WIDTH="37" HEIGHT="24" ALIGN="BOTTOM" BORDER="0" ALT="next"
SRC="file:/usr/share/latex2html/icons/next.png"></A>
<A NAME="tex2html1233"
HREF="node74.html">
<IMG WIDTH="26" HEIGHT="24" ALIGN="BOTTOM" BORDER="0" ALT="up"
SRC="file:/usr/share/latex2html/icons/up.png"></A>
<A NAME="tex2html1227"
HREF="node77.html">
<IMG WIDTH="63" HEIGHT="24" ALIGN="BOTTOM" BORDER="0" ALT="previous"
SRC="file:/usr/share/latex2html/icons/prev.png"></A>
<A NAME="tex2html1235"
HREF="node1.html">
<IMG WIDTH="65" HEIGHT="24" ALIGN="BOTTOM" BORDER="0" ALT="contents"
SRC="file:/usr/share/latex2html/icons/contents.png"></A>
<BR>
<B> Next:</B> <A NAME="tex2html1238"
HREF="node79.html">Time signal generator</A>
<B> Up:</B> <A NAME="tex2html1234"
HREF="node74.html">Additional simulator components</A>
<B> Previous:</B> <A NAME="tex2html1228"
HREF="node77.html">Weather scenario ()</A>
&nbsp; <B> <A NAME="tex2html1236"
HREF="node1.html">Contents</A></B>
<BR>
<BR>
<!--End of Navigation Panel-->
<H3><A NAME="SECTION000101400000000000000">
Quality/Complexity metric plugins</A>
</H3>
Various plugins to measure schedule quality and problem complexity to suit the application.
<P>
<BR><HR>
<ADDRESS>
Steve Fraser
2008-01-31
</ADDRESS>
</BODY>
</HTML>

View File

@@ -0,0 +1,20 @@
<?xml version="1.0" encoding="utf-8"?>
<!DOCTYPE html PUBLIC "-//W3C//DTD XHTML 1.0 Strict//EN"
"http://www.w3.org/TR/xhtml1/DTD/xhtml1-strict.dtd">
<html xmlns="http://www.w3.org/1999/xhtml" lang="en" xml:lang="en">
<head>
<!-- 2019-06-08 Sat 04:15 -->
<meta http-equiv="Content-Type" /
content="text/html;charset=utf-8" />
<meta http-equiv="Content-Type" content="text/html;charset=utf-8" />
<meta name="viewport" content="width=device-width, initial-scale=1" />
<title>unpackaged.el</title>
<meta name="generator" content="Org mode" />
<meta name="author" content="Adam Porter" />
<style type="text/css">
<!--/*--><![CDATA[/*><!--*/
</style>
</head>
<body>
</body>
</html>

31
_testdata/HTML/pages.html Normal file
View File

@@ -0,0 +1,31 @@
<!DOCTYPE html PUBLIC "-//W3C//DTD XHTML 1.0 Transitional//EN" "http://www.w3.org/TR/xhtml1/DTD/xhtml1-transitional.dtd">
<html xmlns="http://www.w3.org/1999/xhtml">
<head>
<meta http-equiv="Content-Type" content="text/xhtml;charset=UTF-8"/>
<title>Related Pages</title>
<link href="qt.css" rel="stylesheet" type="text/css"/>
</head>
<body>
<div class=header>
<a class=headerLink href="index.html">Main Page</a> &middot;
<a class=headerLink href="classoverview.html">Class Overview</a> &middot;
<a class=headerLink href="hierarchy.html">Hierarchy</a> &middot;
<a class=headerLink href="annotated.html">All Classes</a>
</div>
<!-- Generated by Doxygen 1.8.1.2 -->
</div><!-- top -->
<div class="header">
<div class="headertitle">
<div class="title">Related Pages</div> </div>
</div><!--header-->
<div class="contents">
<div class="textblock">Here is a list of all related documentation pages:</div><div class="directory">
<table class="directory">
<tr id="row_0_" class="even"><td class="entry"><img src="ftv2node.png" alt="o" width="16" height="22" /><a class="el" href="classoverview.html" target="_self">Class Overview</a></td><td class="desc"></td></tr>
<tr id="row_1_"><td class="entry"><img src="ftv2lastnode.png" alt="\" width="16" height="22" /><a class="el" href="thelayoutsystem.html" target="_self">The Layout System</a></td><td class="desc"></td></tr>
</table>
</div><!-- directory -->
</div><!-- contents -->
<div class="footer" />Generated with <a href="http://www.doxygen.org/index.html">Doxygen</a> 1.8.1.2</div>
</body>
</html>

View File

@@ -0,0 +1,10 @@
<!DOCTYPE html>
<html>
<head>
<meta
name ="generator"
content =
"Org mode"
>
</head>
</html>

View File

@@ -0,0 +1,6 @@
<!DOCTYPE html>
<html>
<head>
<meta name = generator content = makeinfo />
</head>
</html>

View File

@@ -0,0 +1,6 @@
<!DOCTYPE html>
<html>
<head>
<meta name = 'generator' content = 'Org mode'>
</head>
</html>

4
_testdata/HTML/ronn.html Normal file
View File

@@ -0,0 +1,4 @@
<!DOCTYPE html>
<html>
<meta name="generator" value="Ronn/v0.7.3 (http://github.com/rtomayko/ronn/tree/0.7.3)" />
</html>

View File

@@ -0,0 +1,4 @@
<!DOCTYPE html>
<html>
<meta name="generator" content="Some sick tool nobody's using yet" />
</html>

View File

@@ -0,0 +1,8 @@
<!DOCTYPE html>
<html>
<head>
<META NAME= 'GENERATOR'
CONTENT
= 'ORG MODE'>
</head>
</html>

View File

@@ -0,0 +1,15 @@
{
"id": "2ea73365-b6f1-4bd1-a454-d57a67e50684",
"modelName": "GMFolder",
"mvc": "1.1",
"name": "2ea73365-b6f1-4bd1-a454-d57a67e50684",
"children": [
"d74cdac8-2717-46a5-a8ed-5f732e02a268",
"b68cebe1-d0fa-44ed-8a26-595c2885a1fc",
"1e95eeaf-b11a-44a2-a1b0-bd201f6366ee"
],
"filterType": "GMSprite",
"folderName": "sprites",
"isDefaultView": false,
"localisedFolderName": "ResourceTree_Sprites"
}

View File

@@ -0,0 +1,374 @@
{
"id": "62aa8308-837c-4613-a44e-a1306eee5c3c",
"modelName": "GMProject",
"mvc": "1.0",
"IsDnDProject": false,
"configs": [
],
"option_ecma": false,
"parentProject": {
"id": "18731a27-007b-4c7c-802b-b7eac446e27a",
"modelName": "GMProjectParent",
"mvc": "1.0",
"alteredResources": [
{
"Key": "ed6a955d-5826-4f98-a450-10b414266c27",
"Value": {
"configDeltas": [
"inherited"
],
"id": "a3c2778c-ba48-40f4-b180-c31dd0bf91c2",
"resourcePath": "options\\main\\options_main.yy",
"resourceType": "GMMainOptions"
}
}
],
"hiddenResources": [
],
"projectPath": "${base_project}"
},
"resources": [
{
"Key": "05f45e8d-a727-4650-b0c2-0ac5f793f164",
"Value": {
"id": "6f75d81a-3fe8-484f-8178-ce66c6b9b8cf",
"resourcePath": "views\\05f45e8d-a727-4650-b0c2-0ac5f793f164.yy",
"resourceType": "GMFolder"
}
},
{
"Key": "128687b6-cf4b-4bad-bfec-52960e01a538",
"Value": {
"id": "ec6296c1-a5d9-4c21-b3e3-5d4363a2070c",
"resourcePath": "views\\128687b6-cf4b-4bad-bfec-52960e01a538.yy",
"resourceType": "GMFolder"
}
},
{
"Key": "1e95eeaf-b11a-44a2-a1b0-bd201f6366ee",
"Value": {
"id": "b109c79d-b9d0-4d6a-8b13-e460efb93824",
"resourcePath": "views\\1e95eeaf-b11a-44a2-a1b0-bd201f6366ee.yy",
"resourceType": "GMFolder"
}
},
{
"Key": "227a1abf-7147-4a8b-bb31-62a5d3a9a07e",
"Value": {
"id": "d2010bb7-ac3c-4717-a93e-edb7e447e257",
"resourcePath": "objects\\obj_king\\obj_king.yy",
"resourceType": "GMObject"
}
},
{
"Key": "297651c9-b7a7-4a65-97f1-8ceb3b64f65d",
"Value": {
"id": "ab37c354-5987-4384-ae57-4edb8c73110d",
"resourcePath": "views\\297651c9-b7a7-4a65-97f1-8ceb3b64f65d.yy",
"resourceType": "GMFolder"
}
},
{
"Key": "2d0a83ba-80a2-40b2-8805-93347555ef97",
"Value": {
"id": "cb703663-2592-4c26-bfbb-a11a76390b27",
"resourcePath": "objects\\par_depthobject\\par_depthobject.yy",
"resourceType": "GMObject"
}
},
{
"Key": "2ea73365-b6f1-4bd1-a454-d57a67e50684",
"Value": {
"id": "1136167c-dc54-4c46-b98d-c616e004866b",
"resourcePath": "views\\2ea73365-b6f1-4bd1-a454-d57a67e50684.yy",
"resourceType": "GMFolder"
}
},
{
"Key": "31ff7d32-f0cd-4e2e-a129-338e3e580a1a",
"Value": {
"id": "027854a7-69ad-4dd9-bb97-94cf1cd09c90",
"resourcePath": "sprites\\spr_shadow\\spr_shadow.yy",
"resourceType": "GMSprite"
}
},
{
"Key": "48b46420-2f90-4794-9064-b6fdfad99fbd",
"Value": {
"id": "843e68a6-9d39-4f4c-99b0-7e2beb280bdc",
"resourcePath": "views\\48b46420-2f90-4794-9064-b6fdfad99fbd.yy",
"resourceType": "GMFolder"
}
},
{
"Key": "4c4a5f6b-851e-45f2-9d16-4e770693d0d6",
"Value": {
"id": "d27ca460-a737-425d-b658-fda4e232da2d",
"resourcePath": "views\\4c4a5f6b-851e-45f2-9d16-4e770693d0d6.yy",
"resourceType": "GMFolder"
}
},
{
"Key": "53f34530-342f-440f-8cd3-3df3c5bfc232",
"Value": {
"id": "7189b8af-fafa-4643-b05a-1efec553c075",
"resourcePath": "views\\53f34530-342f-440f-8cd3-3df3c5bfc232.yy",
"resourceType": "GMFolder"
}
},
{
"Key": "556893b0-c414-45f4-b592-5ae7021b4e74",
"Value": {
"id": "aec12608-534e-4e2c-ad54-c3996c929309",
"resourcePath": "objects\\depthsorter\\depthsorter.yy",
"resourceType": "GMObject"
}
},
{
"Key": "583b7bc3-11a0-419c-8028-17f69745ce53",
"Value": {
"id": "f6fd0fc5-a491-4c44-b798-e69b3ff043f7",
"resourcePath": "sprites\\spr_knight\\spr_knight.yy",
"resourceType": "GMSprite"
}
},
{
"Key": "5fd06095-db49-460e-9954-34d8113518c0",
"Value": {
"id": "043b6ae1-c67b-4478-8d4a-98c54da49c59",
"resourcePath": "views\\5fd06095-db49-460e-9954-34d8113518c0.yy",
"resourceType": "GMFolder"
}
},
{
"Key": "62e5c66d-8c36-4bc8-af71-c4119d077b56",
"Value": {
"id": "57e7a582-a710-4c41-9549-aa504dc9b2b0",
"resourcePath": "sprites\\spr_player_idle\\spr_player_idle.yy",
"resourceType": "GMSprite"
}
},
{
"Key": "638c7030-5359-4b3c-8e90-60cb3fcb9704",
"Value": {
"id": "60e7c880-7135-4f49-adb6-92520a22ca65",
"resourcePath": "objects\\obj_greentree\\obj_greentree.yy",
"resourceType": "GMObject"
}
},
{
"Key": "78fc92d3-2675-43e1-a794-30066b3aea37",
"Value": {
"id": "7b9e43bd-7dbc-4b55-b3a5-88176788511a",
"resourcePath": "objects\\obj_knight\\obj_knight.yy",
"resourceType": "GMObject"
}
},
{
"Key": "82f217a6-9373-47d2-85d9-14ce23a512a0",
"Value": {
"id": "bf7cdb17-ee68-4006-b102-afa68538dd7e",
"resourcePath": "views\\82f217a6-9373-47d2-85d9-14ce23a512a0.yy",
"resourceType": "GMFolder"
}
},
{
"Key": "83d63ce4-d03f-444b-b478-522ad5b314a9",
"Value": {
"id": "a927ec72-d381-4816-9d70-c9b78f3e9eb0",
"resourcePath": "views\\83d63ce4-d03f-444b-b478-522ad5b314a9.yy",
"resourceType": "GMFolder"
}
},
{
"Key": "8a96eadc-2360-466a-ae4a-7a71b1cfda17",
"Value": {
"id": "f5608dcf-9a81-490d-9373-fa59fc44fdbf",
"resourcePath": "tilesets\\tileset_1\\tileset_1.yy",
"resourceType": "GMTileSet"
}
},
{
"Key": "8df8cbdc-c273-479a-83c9-496f2e62d294",
"Value": {
"id": "a74e5eb9-978f-47f3-aa50-20b7a3fa9baa",
"resourcePath": "views\\8df8cbdc-c273-479a-83c9-496f2e62d294.yy",
"resourceType": "GMFolder"
}
},
{
"Key": "9c35e2b5-ef9d-468c-8d88-8e8636785ca0",
"Value": {
"id": "fe440907-a939-4974-83aa-853900a43472",
"resourcePath": "views\\9c35e2b5-ef9d-468c-8d88-8e8636785ca0.yy",
"resourceType": "GMFolder"
}
},
{
"Key": "a09f41cd-a4f3-4818-9942-6c258e8a0b0e",
"Value": {
"id": "ae291c5c-f27e-44a2-a6a7-8192ba7a8297",
"resourcePath": "views\\a09f41cd-a4f3-4818-9942-6c258e8a0b0e.yy",
"resourceType": "GMFolder"
}
},
{
"Key": "a9188620-a624-4a5a-83ae-a1b53faf038b",
"Value": {
"id": "3a6a5b33-9543-4fad-922d-403ffb3e8fa2",
"resourcePath": "options\\linux\\options_linux.yy",
"resourceType": "GMLinuxOptions"
}
},
{
"Key": "a9c4bc40-93a0-4449-a951-9bfc82428101",
"Value": {
"id": "6b8e239a-9414-498c-85ae-23d5f8c1cb90",
"resourcePath": "objects\\obj_collision\\obj_collision.yy",
"resourceType": "GMObject"
}
},
{
"Key": "aa4ccde8-d128-4dfe-a243-5efe2da9ff0d",
"Value": {
"id": "41ccb1a0-cf8e-4b28-a1fb-ce53f209bcd0",
"resourcePath": "views\\aa4ccde8-d128-4dfe-a243-5efe2da9ff0d.yy",
"resourceType": "GMFolder"
}
},
{
"Key": "ac913661-4852-4ce1-89ce-3a031aed6119",
"Value": {
"id": "09ddc355-efe1-4b77-9431-3bc87009e880",
"resourcePath": "sprites\\spr_tileset1\\spr_tileset1.yy",
"resourceType": "GMSprite"
}
},
{
"Key": "acfe2407-bee7-4ce9-ac58-d3327085f21b",
"Value": {
"id": "201500e6-154c-4c05-8221-57fdd65f74e7",
"resourcePath": "sprites\\spr_player\\spr_player.yy",
"resourceType": "GMSprite"
}
},
{
"Key": "af01d96e-a301-41cc-9c82-eb60af19cbae",
"Value": {
"id": "34f0d902-d0aa-4ab8-8d02-9d91150b816c",
"resourcePath": "views\\af01d96e-a301-41cc-9c82-eb60af19cbae.yy",
"resourceType": "GMFolder"
}
},
{
"Key": "b4fe38c4-c338-4d32-b85f-1b41d9a2446d",
"Value": {
"id": "ae19ee11-6b01-4f2b-8f55-adf5cb081961",
"resourcePath": "sprites\\spr_collision\\spr_collision.yy",
"resourceType": "GMSprite"
}
},
{
"Key": "b68cebe1-d0fa-44ed-8a26-595c2885a1fc",
"Value": {
"id": "1f6a859b-68fa-4564-bcb6-1fe387319eb4",
"resourcePath": "views\\b68cebe1-d0fa-44ed-8a26-595c2885a1fc.yy",
"resourceType": "GMFolder"
}
},
{
"Key": "bd8b3db8-e5e6-44f1-bfcf-45d3242a0122",
"Value": {
"id": "4450def6-f270-4dd7-8896-c9d8ac34be9b",
"resourcePath": "sprites\\spr_scout\\spr_scout.yy",
"resourceType": "GMSprite"
}
},
{
"Key": "bffbcd96-a8c7-431f-af4f-7f66e8f84189",
"Value": {
"id": "bfee4d13-6e16-41f6-9fb7-095126af6fcd",
"resourcePath": "rooms\\rm_1\\rm_1.yy",
"resourceType": "GMRoom"
}
},
{
"Key": "cc3b80a7-5d09-4bea-b9c8-b093f1b244c8",
"Value": {
"id": "ac219fd8-ba00-4bff-857c-93d7cdb409c8",
"resourcePath": "views\\cc3b80a7-5d09-4bea-b9c8-b093f1b244c8.yy",
"resourceType": "GMFolder"
}
},
{
"Key": "cc98d028-7bdd-4680-85f3-c87a7baa481e",
"Value": {
"id": "a84e7ce1-a928-4a86-bd79-8355f266e12e",
"resourcePath": "options\\windows\\options_windows.yy",
"resourceType": "GMWindowsOptions"
}
},
{
"Key": "d3e83dba-5be8-436f-866d-9a5a2120adb1",
"Value": {
"id": "ca2f0120-77ec-4bb4-8b0b-c12af28e2db0",
"resourcePath": "views\\d3e83dba-5be8-436f-866d-9a5a2120adb1.yy",
"resourceType": "GMFolder"
}
},
{
"Key": "d74cdac8-2717-46a5-a8ed-5f732e02a268",
"Value": {
"id": "3b26d2bb-6a56-4a30-9c5a-47c3bcc83bb9",
"resourcePath": "views\\d74cdac8-2717-46a5-a8ed-5f732e02a268.yy",
"resourceType": "GMFolder"
}
},
{
"Key": "e3e41b8b-b981-42ce-aab7-e6265764c7a8",
"Value": {
"id": "a24e4fd7-f4da-4da0-aff1-da1bb864c0b4",
"resourcePath": "views\\e3e41b8b-b981-42ce-aab7-e6265764c7a8.yy",
"resourceType": "GMFolder"
}
},
{
"Key": "f1fbb653-bca3-4725-9e81-6a3556760d0c",
"Value": {
"id": "76dcd643-64fe-49d6-83ff-e15184b56150",
"resourcePath": "objects\\obj_scout\\obj_scout.yy",
"resourceType": "GMObject"
}
},
{
"Key": "f418569b-3bdd-4706-a0e4-364317f54032",
"Value": {
"id": "1292899d-f280-4f75-93b4-b2ad6099dd3d",
"resourcePath": "options\\mac\\options_mac.yy",
"resourceType": "GMMacOptions"
}
},
{
"Key": "f48fa589-53e5-4ce0-a010-c16b38192e31",
"Value": {
"id": "e950beb0-3503-4c3d-9cb9-8743c91d7c94",
"resourcePath": "views\\f48fa589-53e5-4ce0-a010-c16b38192e31.yy",
"resourceType": "GMFolder"
}
},
{
"Key": "fd46d5b1-fe49-4ecf-b809-0a2449808fbe",
"Value": {
"id": "09d46108-701a-4592-906f-bcb479008507",
"resourcePath": "sprites\\spr_tree\\spr_tree.yy",
"resourceType": "GMSprite"
}
}
],
"script_order": [
],
"tutorial": ""
}

View File

@@ -0,0 +1,625 @@
// This is a generated file. Not intended for manual editing.
package org.intellij.grammar.parser;
import com.intellij.lang.PsiBuilder;
import com.intellij.lang.PsiBuilder.Marker;
import static org.intellij.grammar.psi.BnfTypes.*;
import static org.intellij.grammar.parser.GeneratedParserUtilBase.*;
import com.intellij.psi.tree.IElementType;
import com.intellij.lang.ASTNode;
import com.intellij.psi.tree.TokenSet;
import com.intellij.lang.PsiParser;
import com.intellij.lang.LightPsiParser;
@SuppressWarnings({"SimplifiableIfStatement", "UnusedAssignment"})
public class GrammarParser implements PsiParser, LightPsiParser {
public ASTNode parse(IElementType t, PsiBuilder b) {
parseLight(t, b);
return b.getTreeBuilt();
}
public void parseLight(IElementType t, PsiBuilder b) {
boolean r;
b = adapt_builder_(t, b, this, EXTENDS_SETS_);
Marker m = enter_section_(b, 0, _COLLAPSE_, null);
if (t == BNF_ATTR) {
r = attr(b, 0);
}
else if (t == BNF_ATTR_PATTERN) {
r = attr_pattern(b, 0);
}
else if (t == BNF_ATTR_VALUE) {
r = attr_value(b, 0);
}
else if (t == BNF_ATTRS) {
r = attrs(b, 0);
}
else if (t == BNF_CHOICE) {
r = choice(b, 0);
}
else if (t == BNF_EXPRESSION) {
r = expression(b, 0);
}
else if (t == BNF_LITERAL_EXPRESSION) {
r = literal_expression(b, 0);
}
else if (t == BNF_MODIFIER) {
r = modifier(b, 0);
}
else if (t == BNF_PAREN_EXPRESSION) {
r = paren_expression(b, 0);
}
else if (t == BNF_PREDICATE) {
r = predicate(b, 0);
}
else if (t == BNF_PREDICATE_SIGN) {
r = predicate_sign(b, 0);
}
else if (t == BNF_QUANTIFIED) {
r = quantified(b, 0);
}
else if (t == BNF_QUANTIFIER) {
r = quantifier(b, 0);
}
else if (t == BNF_REFERENCE_OR_TOKEN) {
r = reference_or_token(b, 0);
}
else if (t == BNF_RULE) {
r = rule(b, 0);
}
else if (t == BNF_SEQUENCE) {
r = sequence(b, 0);
}
else if (t == BNF_STRING_LITERAL_EXPRESSION) {
r = string_literal_expression(b, 0);
}
else {
r = parse_root_(t, b, 0);
}
exit_section_(b, 0, m, t, r, true, TRUE_CONDITION);
}
protected boolean parse_root_(IElementType t, PsiBuilder b, int l) {
return grammar(b, l + 1);
}
public static final TokenSet[] EXTENDS_SETS_ = new TokenSet[] {
create_token_set_(BNF_LITERAL_EXPRESSION, BNF_STRING_LITERAL_EXPRESSION),
create_token_set_(BNF_CHOICE, BNF_EXPRESSION, BNF_LITERAL_EXPRESSION, BNF_PAREN_EXPRESSION,
BNF_PREDICATE, BNF_QUANTIFIED, BNF_REFERENCE_OR_TOKEN, BNF_SEQUENCE,
BNF_STRING_LITERAL_EXPRESSION),
};
/* ********************************************************** */
// id attr_pattern? '=' attr_value ';'?
public static boolean attr(PsiBuilder b, int l) {
if (!recursion_guard_(b, l, "attr")) return false;
boolean r, p;
Marker m = enter_section_(b, l, _NONE_, "<attr>");
r = consumeToken(b, BNF_ID);
p = r; // pin = 1
r = r && report_error_(b, attr_1(b, l + 1));
r = p && report_error_(b, consumeToken(b, BNF_OP_EQ)) && r;
r = p && report_error_(b, attr_value(b, l + 1)) && r;
r = p && attr_4(b, l + 1) && r;
exit_section_(b, l, m, BNF_ATTR, r, p, attr_recover_until_parser_);
return r || p;
}
// attr_pattern?
private static boolean attr_1(PsiBuilder b, int l) {
if (!recursion_guard_(b, l, "attr_1")) return false;
attr_pattern(b, l + 1);
return true;
}
// ';'?
private static boolean attr_4(PsiBuilder b, int l) {
if (!recursion_guard_(b, l, "attr_4")) return false;
consumeToken(b, BNF_SEMICOLON);
return true;
}
/* ********************************************************** */
// '(' string ')'
public static boolean attr_pattern(PsiBuilder b, int l) {
if (!recursion_guard_(b, l, "attr_pattern")) return false;
if (!nextTokenIs(b, BNF_LEFT_PAREN)) return false;
boolean r;
Marker m = enter_section_(b);
r = consumeToken(b, BNF_LEFT_PAREN);
r = r && consumeToken(b, BNF_STRING);
r = r && consumeToken(b, BNF_RIGHT_PAREN);
exit_section_(b, m, BNF_ATTR_PATTERN, r);
return r;
}
/* ********************************************************** */
// !'}'
static boolean attr_recover_until(PsiBuilder b, int l) {
if (!recursion_guard_(b, l, "attr_recover_until")) return false;
boolean r;
Marker m = enter_section_(b, l, _NOT_, null);
r = !consumeToken(b, BNF_RIGHT_BRACE);
exit_section_(b, l, m, null, r, false, null);
return r;
}
/* ********************************************************** */
// (reference_or_token | literal_expression) !'='
public static boolean attr_value(PsiBuilder b, int l) {
if (!recursion_guard_(b, l, "attr_value")) return false;
boolean r;
Marker m = enter_section_(b, l, _NONE_, "<attr value>");
r = attr_value_0(b, l + 1);
r = r && attr_value_1(b, l + 1);
exit_section_(b, l, m, BNF_ATTR_VALUE, r, false, null);
return r;
}
// reference_or_token | literal_expression
private static boolean attr_value_0(PsiBuilder b, int l) {
if (!recursion_guard_(b, l, "attr_value_0")) return false;
boolean r;
Marker m = enter_section_(b);
r = reference_or_token(b, l + 1);
if (!r) r = literal_expression(b, l + 1);
exit_section_(b, m, null, r);
return r;
}
// !'='
private static boolean attr_value_1(PsiBuilder b, int l) {
if (!recursion_guard_(b, l, "attr_value_1")) return false;
boolean r;
Marker m = enter_section_(b, l, _NOT_, null);
r = !consumeToken(b, BNF_OP_EQ);
exit_section_(b, l, m, null, r, false, null);
return r;
}
/* ********************************************************** */
// '{' attr* '}'
public static boolean attrs(PsiBuilder b, int l) {
if (!recursion_guard_(b, l, "attrs")) return false;
if (!nextTokenIs(b, BNF_LEFT_BRACE)) return false;
boolean r, p;
Marker m = enter_section_(b, l, _NONE_, null);
r = consumeToken(b, BNF_LEFT_BRACE);
p = r; // pin = 1
r = r && report_error_(b, attrs_1(b, l + 1));
r = p && consumeToken(b, BNF_RIGHT_BRACE) && r;
exit_section_(b, l, m, BNF_ATTRS, r, p, null);
return r || p;
}
// attr*
private static boolean attrs_1(PsiBuilder b, int l) {
if (!recursion_guard_(b, l, "attrs_1")) return false;
int c = current_position_(b);
while (true) {
if (!attr(b, l + 1)) break;
if (!empty_element_parsed_guard_(b, "attrs_1", c)) break;
c = current_position_(b);
}
return true;
}
/* ********************************************************** */
// '{' sequence ('|' sequence)* '}' | sequence choice_tail*
public static boolean choice(PsiBuilder b, int l) {
if (!recursion_guard_(b, l, "choice")) return false;
boolean r;
Marker m = enter_section_(b, l, _COLLAPSE_, "<choice>");
r = choice_0(b, l + 1);
if (!r) r = choice_1(b, l + 1);
exit_section_(b, l, m, BNF_CHOICE, r, false, null);
return r;
}
// '{' sequence ('|' sequence)* '}'
private static boolean choice_0(PsiBuilder b, int l) {
if (!recursion_guard_(b, l, "choice_0")) return false;
boolean r;
Marker m = enter_section_(b);
r = consumeToken(b, BNF_LEFT_BRACE);
r = r && sequence(b, l + 1);
r = r && choice_0_2(b, l + 1);
r = r && consumeToken(b, BNF_RIGHT_BRACE);
exit_section_(b, m, null, r);
return r;
}
// ('|' sequence)*
private static boolean choice_0_2(PsiBuilder b, int l) {
if (!recursion_guard_(b, l, "choice_0_2")) return false;
int c = current_position_(b);
while (true) {
if (!choice_0_2_0(b, l + 1)) break;
if (!empty_element_parsed_guard_(b, "choice_0_2", c)) break;
c = current_position_(b);
}
return true;
}
// '|' sequence
private static boolean choice_0_2_0(PsiBuilder b, int l) {
if (!recursion_guard_(b, l, "choice_0_2_0")) return false;
boolean r;
Marker m = enter_section_(b);
r = consumeToken(b, BNF_OP_OR);
r = r && sequence(b, l + 1);
exit_section_(b, m, null, r);
return r;
}
// sequence choice_tail*
private static boolean choice_1(PsiBuilder b, int l) {
if (!recursion_guard_(b, l, "choice_1")) return false;
boolean r;
Marker m = enter_section_(b);
r = sequence(b, l + 1);
r = r && choice_1_1(b, l + 1);
exit_section_(b, m, null, r);
return r;
}
// choice_tail*
private static boolean choice_1_1(PsiBuilder b, int l) {
if (!recursion_guard_(b, l, "choice_1_1")) return false;
int c = current_position_(b);
while (true) {
if (!choice_tail(b, l + 1)) break;
if (!empty_element_parsed_guard_(b, "choice_1_1", c)) break;
c = current_position_(b);
}
return true;
}
/* ********************************************************** */
// '|' sequence
static boolean choice_tail(PsiBuilder b, int l) {
if (!recursion_guard_(b, l, "choice_tail")) return false;
if (!nextTokenIs(b, BNF_OP_OR)) return false;
boolean r, p;
Marker m = enter_section_(b, l, _NONE_, null);
r = consumeToken(b, BNF_OP_OR);
p = r; // pin = 1
r = r && sequence(b, l + 1);
exit_section_(b, l, m, null, r, p, null);
return r || p;
}
/* ********************************************************** */
// choice?
public static boolean expression(PsiBuilder b, int l) {
if (!recursion_guard_(b, l, "expression")) return false;
Marker m = enter_section_(b, l, _COLLAPSE_, "<expression>");
choice(b, l + 1);
exit_section_(b, l, m, BNF_EXPRESSION, true, false, null);
return true;
}
/* ********************************************************** */
// (attrs | rule) *
static boolean grammar(PsiBuilder b, int l) {
if (!recursion_guard_(b, l, "grammar")) return false;
int c = current_position_(b);
while (true) {
if (!grammar_0(b, l + 1)) break;
if (!empty_element_parsed_guard_(b, "grammar", c)) break;
c = current_position_(b);
}
return true;
}
// attrs | rule
private static boolean grammar_0(PsiBuilder b, int l) {
if (!recursion_guard_(b, l, "grammar_0")) return false;
boolean r;
Marker m = enter_section_(b);
r = attrs(b, l + 1);
if (!r) r = rule(b, l + 1);
exit_section_(b, m, null, r);
return r;
}
/* ********************************************************** */
// string_literal_expression | number
public static boolean literal_expression(PsiBuilder b, int l) {
if (!recursion_guard_(b, l, "literal_expression")) return false;
if (!nextTokenIs(b, "<literal expression>", BNF_NUMBER, BNF_STRING)) return false;
boolean r;
Marker m = enter_section_(b, l, _COLLAPSE_, "<literal expression>");
r = string_literal_expression(b, l + 1);
if (!r) r = consumeToken(b, BNF_NUMBER);
exit_section_(b, l, m, BNF_LITERAL_EXPRESSION, r, false, null);
return r;
}
/* ********************************************************** */
// 'private' | 'external' | 'wrapped'
public static boolean modifier(PsiBuilder b, int l) {
if (!recursion_guard_(b, l, "modifier")) return false;
boolean r;
Marker m = enter_section_(b, l, _NONE_, "<modifier>");
r = consumeToken(b, "private");
if (!r) r = consumeToken(b, "external");
if (!r) r = consumeToken(b, "wrapped");
exit_section_(b, l, m, BNF_MODIFIER, r, false, null);
return r;
}
/* ********************************************************** */
// quantified | predicate
static boolean option(PsiBuilder b, int l) {
if (!recursion_guard_(b, l, "option")) return false;
boolean r;
Marker m = enter_section_(b);
r = quantified(b, l + 1);
if (!r) r = predicate(b, l + 1);
exit_section_(b, m, null, r);
return r;
}
/* ********************************************************** */
// '(' expression ')'
public static boolean paren_expression(PsiBuilder b, int l) {
if (!recursion_guard_(b, l, "paren_expression")) return false;
if (!nextTokenIs(b, BNF_LEFT_PAREN)) return false;
boolean r, p;
Marker m = enter_section_(b, l, _NONE_, null);
r = consumeToken(b, BNF_LEFT_PAREN);
p = r; // pin = 1
r = r && report_error_(b, expression(b, l + 1));
r = p && consumeToken(b, BNF_RIGHT_PAREN) && r;
exit_section_(b, l, m, BNF_PAREN_EXPRESSION, r, p, null);
return r || p;
}
/* ********************************************************** */
// predicate_sign simple
public static boolean predicate(PsiBuilder b, int l) {
if (!recursion_guard_(b, l, "predicate")) return false;
if (!nextTokenIs(b, "<predicate>", BNF_OP_NOT, BNF_OP_AND)) return false;
boolean r;
Marker m = enter_section_(b, l, _NONE_, "<predicate>");
r = predicate_sign(b, l + 1);
r = r && simple(b, l + 1);
exit_section_(b, l, m, BNF_PREDICATE, r, false, null);
return r;
}
/* ********************************************************** */
// '&' | '!'
public static boolean predicate_sign(PsiBuilder b, int l) {
if (!recursion_guard_(b, l, "predicate_sign")) return false;
if (!nextTokenIs(b, "<predicate sign>", BNF_OP_NOT, BNF_OP_AND)) return false;
boolean r;
Marker m = enter_section_(b, l, _NONE_, "<predicate sign>");
r = consumeToken(b, BNF_OP_AND);
if (!r) r = consumeToken(b, BNF_OP_NOT);
exit_section_(b, l, m, BNF_PREDICATE_SIGN, r, false, null);
return r;
}
/* ********************************************************** */
// '[' expression ']' | simple quantifier?
public static boolean quantified(PsiBuilder b, int l) {
if (!recursion_guard_(b, l, "quantified")) return false;
boolean r;
Marker m = enter_section_(b, l, _COLLAPSE_, "<quantified>");
r = quantified_0(b, l + 1);
if (!r) r = quantified_1(b, l + 1);
exit_section_(b, l, m, BNF_QUANTIFIED, r, false, null);
return r;
}
// '[' expression ']'
private static boolean quantified_0(PsiBuilder b, int l) {
if (!recursion_guard_(b, l, "quantified_0")) return false;
boolean r;
Marker m = enter_section_(b);
r = consumeToken(b, BNF_LEFT_BRACKET);
r = r && expression(b, l + 1);
r = r && consumeToken(b, BNF_RIGHT_BRACKET);
exit_section_(b, m, null, r);
return r;
}
// simple quantifier?
private static boolean quantified_1(PsiBuilder b, int l) {
if (!recursion_guard_(b, l, "quantified_1")) return false;
boolean r;
Marker m = enter_section_(b);
r = simple(b, l + 1);
r = r && quantified_1_1(b, l + 1);
exit_section_(b, m, null, r);
return r;
}
// quantifier?
private static boolean quantified_1_1(PsiBuilder b, int l) {
if (!recursion_guard_(b, l, "quantified_1_1")) return false;
quantifier(b, l + 1);
return true;
}
/* ********************************************************** */
// '?' | '+' | '*'
public static boolean quantifier(PsiBuilder b, int l) {
if (!recursion_guard_(b, l, "quantifier")) return false;
boolean r;
Marker m = enter_section_(b, l, _NONE_, "<quantifier>");
r = consumeToken(b, BNF_OP_OPT);
if (!r) r = consumeToken(b, BNF_OP_ONEMORE);
if (!r) r = consumeToken(b, BNF_OP_ZEROMORE);
exit_section_(b, l, m, BNF_QUANTIFIER, r, false, null);
return r;
}
/* ********************************************************** */
// id
public static boolean reference_or_token(PsiBuilder b, int l) {
if (!recursion_guard_(b, l, "reference_or_token")) return false;
if (!nextTokenIs(b, BNF_ID)) return false;
boolean r;
Marker m = enter_section_(b);
r = consumeToken(b, BNF_ID);
exit_section_(b, m, BNF_REFERENCE_OR_TOKEN, r);
return r;
}
/* ********************************************************** */
// modifier* id '::=' expression attrs? ';'?
public static boolean rule(PsiBuilder b, int l) {
if (!recursion_guard_(b, l, "rule")) return false;
boolean r, p;
Marker m = enter_section_(b, l, _NONE_, "<rule>");
r = rule_0(b, l + 1);
r = r && consumeToken(b, BNF_ID);
r = r && consumeToken(b, BNF_OP_IS);
p = r; // pin = 3
r = r && report_error_(b, expression(b, l + 1));
r = p && report_error_(b, rule_4(b, l + 1)) && r;
r = p && rule_5(b, l + 1) && r;
exit_section_(b, l, m, BNF_RULE, r, p, rule_recover_until_parser_);
return r || p;
}
// modifier*
private static boolean rule_0(PsiBuilder b, int l) {
if (!recursion_guard_(b, l, "rule_0")) return false;
int c = current_position_(b);
while (true) {
if (!modifier(b, l + 1)) break;
if (!empty_element_parsed_guard_(b, "rule_0", c)) break;
c = current_position_(b);
}
return true;
}
// attrs?
private static boolean rule_4(PsiBuilder b, int l) {
if (!recursion_guard_(b, l, "rule_4")) return false;
attrs(b, l + 1);
return true;
}
// ';'?
private static boolean rule_5(PsiBuilder b, int l) {
if (!recursion_guard_(b, l, "rule_5")) return false;
consumeToken(b, BNF_SEMICOLON);
return true;
}
/* ********************************************************** */
// !'{'
static boolean rule_recover_until(PsiBuilder b, int l) {
if (!recursion_guard_(b, l, "rule_recover_until")) return false;
boolean r;
Marker m = enter_section_(b, l, _NOT_, null);
r = !consumeToken(b, BNF_LEFT_BRACE);
exit_section_(b, l, m, null, r, false, null);
return r;
}
/* ********************************************************** */
// option +
public static boolean sequence(PsiBuilder b, int l) {
if (!recursion_guard_(b, l, "sequence")) return false;
boolean r;
Marker m = enter_section_(b, l, _COLLAPSE_, "<sequence>");
r = option(b, l + 1);
int c = current_position_(b);
while (r) {
if (!option(b, l + 1)) break;
if (!empty_element_parsed_guard_(b, "sequence", c)) break;
c = current_position_(b);
}
exit_section_(b, l, m, BNF_SEQUENCE, r, false, null);
return r;
}
/* ********************************************************** */
// !(modifier* id '::=' ) reference_or_token | literal_expression | paren_expression
static boolean simple(PsiBuilder b, int l) {
if (!recursion_guard_(b, l, "simple")) return false;
boolean r;
Marker m = enter_section_(b);
r = simple_0(b, l + 1);
if (!r) r = literal_expression(b, l + 1);
if (!r) r = paren_expression(b, l + 1);
exit_section_(b, m, null, r);
return r;
}
// !(modifier* id '::=' ) reference_or_token
private static boolean simple_0(PsiBuilder b, int l) {
if (!recursion_guard_(b, l, "simple_0")) return false;
boolean r;
Marker m = enter_section_(b);
r = simple_0_0(b, l + 1);
r = r && reference_or_token(b, l + 1);
exit_section_(b, m, null, r);
return r;
}
// !(modifier* id '::=' )
private static boolean simple_0_0(PsiBuilder b, int l) {
if (!recursion_guard_(b, l, "simple_0_0")) return false;
boolean r;
Marker m = enter_section_(b, l, _NOT_, null);
r = !simple_0_0_0(b, l + 1);
exit_section_(b, l, m, null, r, false, null);
return r;
}
// modifier* id '::='
private static boolean simple_0_0_0(PsiBuilder b, int l) {
if (!recursion_guard_(b, l, "simple_0_0_0")) return false;
boolean r;
Marker m = enter_section_(b);
r = simple_0_0_0_0(b, l + 1);
r = r && consumeToken(b, BNF_ID);
r = r && consumeToken(b, BNF_OP_IS);
exit_section_(b, m, null, r);
return r;
}
// modifier*
private static boolean simple_0_0_0_0(PsiBuilder b, int l) {
if (!recursion_guard_(b, l, "simple_0_0_0_0")) return false;
int c = current_position_(b);
while (true) {
if (!modifier(b, l + 1)) break;
if (!empty_element_parsed_guard_(b, "simple_0_0_0_0", c)) break;
c = current_position_(b);
}
return true;
}
/* ********************************************************** */
// string
public static boolean string_literal_expression(PsiBuilder b, int l) {
if (!recursion_guard_(b, l, "string_literal_expression")) return false;
if (!nextTokenIs(b, BNF_STRING)) return false;
boolean r;
Marker m = enter_section_(b);
r = consumeToken(b, BNF_STRING);
exit_section_(b, m, BNF_STRING_LITERAL_EXPRESSION, r);
return r;
}
final static Parser attr_recover_until_parser_ = new Parser() {
public boolean parse(PsiBuilder b, int l) {
return attr_recover_until(b, l + 1);
}
};
final static Parser rule_recover_until_parser_ = new Parser() {
public boolean parse(PsiBuilder b, int l) {
return rule_recover_until(b, l + 1);
}
};
}

View File

@@ -0,0 +1,482 @@
/* The following code was generated by JFlex 1.4.3 on 28/01/16 11:27 */
package test;
import com.intellij.lexer.*;
import com.intellij.psi.tree.IElementType;
import static org.intellij.grammar.psi.BnfTypes.*;
/**
* This class is a scanner generated by
* <a href="http://www.jflex.de/">JFlex</a> 1.4.3
* on 28/01/16 11:27 from the specification file
* <tt>/home/abigail/code/intellij-grammar-kit-test/src/test/_GrammarLexer.flex</tt>
*/
public class _GrammarLexer implements FlexLexer {
/** initial size of the lookahead buffer */
private static final int ZZ_BUFFERSIZE = 16384;
/** lexical states */
public static final int YYINITIAL = 0;
/**
* ZZ_LEXSTATE[l] is the state in the DFA for the lexical state l
* ZZ_LEXSTATE[l+1] is the state in the DFA for the lexical state l
* at the beginning of a line
* l is of the form l = 2*k, k a non negative integer
*/
private static final int ZZ_LEXSTATE[] = {
0, 0
};
/**
* Translates characters to character classes
*/
private static final String ZZ_CMAP_PACKED =
"\11\0\1\1\1\1\1\0\1\1\1\1\22\0\1\1\101\0\1\13"+
"\1\0\1\3\1\14\1\0\1\10\1\0\1\2\3\0\1\12\1\7"+
"\3\0\1\6\1\4\1\5\1\11\uff8a\0";
/**
* Translates characters to character classes
*/
private static final char [] ZZ_CMAP = zzUnpackCMap(ZZ_CMAP_PACKED);
/**
* Translates DFA states to action switch labels.
*/
private static final int [] ZZ_ACTION = zzUnpackAction();
private static final String ZZ_ACTION_PACKED_0 =
"\1\0\1\1\1\2\3\1\1\3\10\0\1\4\1\5";
private static int [] zzUnpackAction() {
int [] result = new int[17];
int offset = 0;
offset = zzUnpackAction(ZZ_ACTION_PACKED_0, offset, result);
return result;
}
private static int zzUnpackAction(String packed, int offset, int [] result) {
int i = 0; /* index in packed string */
int j = offset; /* index in unpacked array */
int l = packed.length();
while (i < l) {
int count = packed.charAt(i++);
int value = packed.charAt(i++);
do result[j++] = value; while (--count > 0);
}
return j;
}
/**
* Translates a state to a row index in the transition table
*/
private static final int [] ZZ_ROWMAP = zzUnpackRowMap();
private static final String ZZ_ROWMAP_PACKED_0 =
"\0\0\0\15\0\32\0\47\0\64\0\101\0\15\0\116"+
"\0\133\0\150\0\165\0\202\0\217\0\234\0\251\0\15"+
"\0\15";
private static int [] zzUnpackRowMap() {
int [] result = new int[17];
int offset = 0;
offset = zzUnpackRowMap(ZZ_ROWMAP_PACKED_0, offset, result);
return result;
}
private static int zzUnpackRowMap(String packed, int offset, int [] result) {
int i = 0; /* index in packed string */
int j = offset; /* index in unpacked array */
int l = packed.length();
while (i < l) {
int high = packed.charAt(i++) << 16;
result[j++] = high | packed.charAt(i++);
}
return j;
}
/**
* The transition table of the DFA
*/
private static final int [] ZZ_TRANS = zzUnpackTrans();
private static final String ZZ_TRANS_PACKED_0 =
"\1\2\1\3\1\4\1\2\1\5\2\2\1\6\5\2"+
"\16\0\1\3\16\0\1\7\16\0\1\10\20\0\1\11"+
"\11\0\1\12\20\0\1\13\4\0\1\14\25\0\1\15"+
"\10\0\1\16\21\0\1\17\10\0\1\20\12\0\1\21"+
"\6\0";
private static int [] zzUnpackTrans() {
int [] result = new int[182];
int offset = 0;
offset = zzUnpackTrans(ZZ_TRANS_PACKED_0, offset, result);
return result;
}
private static int zzUnpackTrans(String packed, int offset, int [] result) {
int i = 0; /* index in packed string */
int j = offset; /* index in unpacked array */
int l = packed.length();
while (i < l) {
int count = packed.charAt(i++);
int value = packed.charAt(i++);
value--;
do result[j++] = value; while (--count > 0);
}
return j;
}
/* error codes */
private static final int ZZ_UNKNOWN_ERROR = 0;
private static final int ZZ_NO_MATCH = 1;
private static final int ZZ_PUSHBACK_2BIG = 2;
private static final char[] EMPTY_BUFFER = new char[0];
private static final int YYEOF = -1;
private static java.io.Reader zzReader = null; // Fake
/* error messages for the codes above */
private static final String ZZ_ERROR_MSG[] = {
"Unkown internal scanner error",
"Error: could not match input",
"Error: pushback value was too large"
};
/**
* ZZ_ATTRIBUTE[aState] contains the attributes of state <code>aState</code>
*/
private static final int [] ZZ_ATTRIBUTE = zzUnpackAttribute();
private static final String ZZ_ATTRIBUTE_PACKED_0 =
"\1\0\1\11\4\1\1\11\10\0\2\11";
private static int [] zzUnpackAttribute() {
int [] result = new int[17];
int offset = 0;
offset = zzUnpackAttribute(ZZ_ATTRIBUTE_PACKED_0, offset, result);
return result;
}
private static int zzUnpackAttribute(String packed, int offset, int [] result) {
int i = 0; /* index in packed string */
int j = offset; /* index in unpacked array */
int l = packed.length();
while (i < l) {
int count = packed.charAt(i++);
int value = packed.charAt(i++);
do result[j++] = value; while (--count > 0);
}
return j;
}
/** the current state of the DFA */
private int zzState;
/** the current lexical state */
private int zzLexicalState = YYINITIAL;
/** this buffer contains the current text to be matched and is
the source of the yytext() string */
private CharSequence zzBuffer = "";
/** this buffer may contains the current text array to be matched when it is cheap to acquire it */
private char[] zzBufferArray;
/** the textposition at the last accepting state */
private int zzMarkedPos;
/** the textposition at the last state to be included in yytext */
private int zzPushbackPos;
/** the current text position in the buffer */
private int zzCurrentPos;
/** startRead marks the beginning of the yytext() string in the buffer */
private int zzStartRead;
/** endRead marks the last character in the buffer, that has been read
from input */
private int zzEndRead;
/**
* zzAtBOL == true <=> the scanner is currently at the beginning of a line
*/
private boolean zzAtBOL = true;
/** zzAtEOF == true <=> the scanner is at the EOF */
private boolean zzAtEOF;
/* user code: */
public _GrammarLexer() {
this((java.io.Reader)null);
}
/**
* Creates a new scanner
*
* @param in the java.io.Reader to read input from.
*/
public _GrammarLexer(java.io.Reader in) {
this.zzReader = in;
}
/**
* Unpacks the compressed character translation table.
*
* @param packed the packed character translation table
* @return the unpacked character translation table
*/
private static char [] zzUnpackCMap(String packed) {
char [] map = new char[0x10000];
int i = 0; /* index in packed string */
int j = 0; /* index in unpacked array */
while (i < 52) {
int count = packed.charAt(i++);
char value = packed.charAt(i++);
do map[j++] = value; while (--count > 0);
}
return map;
}
public final int getTokenStart(){
return zzStartRead;
}
public final int getTokenEnd(){
return getTokenStart() + yylength();
}
public void reset(CharSequence buffer, int start, int end,int initialState){
zzBuffer = buffer;
zzBufferArray = com.intellij.util.text.CharArrayUtil.fromSequenceWithoutCopying(buffer);
zzCurrentPos = zzMarkedPos = zzStartRead = start;
zzPushbackPos = 0;
zzAtEOF = false;
zzAtBOL = true;
zzEndRead = end;
yybegin(initialState);
}
/**
* Refills the input buffer.
*
* @return <code>false</code>, iff there was new input.
*
* @exception java.io.IOException if any I/O-Error occurs
*/
private boolean zzRefill() throws java.io.IOException {
return true;
}
/**
* Returns the current lexical state.
*/
public final int yystate() {
return zzLexicalState;
}
/**
* Enters a new lexical state
*
* @param newState the new lexical state
*/
public final void yybegin(int newState) {
zzLexicalState = newState;
}
/**
* Returns the text matched by the current regular expression.
*/
public final CharSequence yytext() {
return zzBuffer.subSequence(zzStartRead, zzMarkedPos);
}
/**
* Returns the character at position <tt>pos</tt> from the
* matched text.
*
* It is equivalent to yytext().charAt(pos), but faster
*
* @param pos the position of the character to fetch.
* A value from 0 to yylength()-1.
*
* @return the character at position pos
*/
public final char yycharat(int pos) {
return zzBufferArray != null ? zzBufferArray[zzStartRead+pos]:zzBuffer.charAt(zzStartRead+pos);
}
/**
* Returns the length of the matched text region.
*/
public final int yylength() {
return zzMarkedPos-zzStartRead;
}
/**
* Reports an error that occured while scanning.
*
* In a wellformed scanner (no or only correct usage of
* yypushback(int) and a match-all fallback rule) this method
* will only be called with things that "Can't Possibly Happen".
* If this method is called, something is seriously wrong
* (e.g. a JFlex bug producing a faulty scanner etc.).
*
* Usual syntax/scanner level error handling should be done
* in error fallback rules.
*
* @param errorCode the code of the errormessage to display
*/
private void zzScanError(int errorCode) {
String message;
try {
message = ZZ_ERROR_MSG[errorCode];
}
catch (ArrayIndexOutOfBoundsException e) {
message = ZZ_ERROR_MSG[ZZ_UNKNOWN_ERROR];
}
throw new Error(message);
}
/**
* Pushes the specified amount of characters back into the input stream.
*
* They will be read again by then next call of the scanning method
*
* @param number the number of characters to be read again.
* This number must not be greater than yylength()!
*/
public void yypushback(int number) {
if ( number > yylength() )
zzScanError(ZZ_PUSHBACK_2BIG);
zzMarkedPos -= number;
}
/**
* Resumes scanning until the next regular expression is matched,
* the end of input is encountered or an I/O-Error occurs.
*
* @return the next token
* @exception java.io.IOException if any I/O-Error occurs
*/
public IElementType advance() throws java.io.IOException {
int zzInput;
int zzAction;
// cached fields:
int zzCurrentPosL;
int zzMarkedPosL;
int zzEndReadL = zzEndRead;
CharSequence zzBufferL = zzBuffer;
char[] zzBufferArrayL = zzBufferArray;
char [] zzCMapL = ZZ_CMAP;
int [] zzTransL = ZZ_TRANS;
int [] zzRowMapL = ZZ_ROWMAP;
int [] zzAttrL = ZZ_ATTRIBUTE;
while (true) {
zzMarkedPosL = zzMarkedPos;
zzAction = -1;
zzCurrentPosL = zzCurrentPos = zzStartRead = zzMarkedPosL;
zzState = ZZ_LEXSTATE[zzLexicalState];
zzForAction: {
while (true) {
if (zzCurrentPosL < zzEndReadL)
zzInput = (zzBufferArrayL != null ? zzBufferArrayL[zzCurrentPosL++] : zzBufferL.charAt(zzCurrentPosL++));
else if (zzAtEOF) {
zzInput = YYEOF;
break zzForAction;
}
else {
// store back cached positions
zzCurrentPos = zzCurrentPosL;
zzMarkedPos = zzMarkedPosL;
boolean eof = zzRefill();
// get translated positions and possibly new buffer
zzCurrentPosL = zzCurrentPos;
zzMarkedPosL = zzMarkedPos;
zzBufferL = zzBuffer;
zzEndReadL = zzEndRead;
if (eof) {
zzInput = YYEOF;
break zzForAction;
}
else {
zzInput = (zzBufferArrayL != null ? zzBufferArrayL[zzCurrentPosL++] : zzBufferL.charAt(zzCurrentPosL++));
}
}
int zzNext = zzTransL[ zzRowMapL[zzState] + zzCMapL[zzInput] ];
if (zzNext == -1) break zzForAction;
zzState = zzNext;
int zzAttributes = zzAttrL[zzState];
if ( (zzAttributes & 1) == 1 ) {
zzAction = zzState;
zzMarkedPosL = zzCurrentPosL;
if ( (zzAttributes & 8) == 8 ) break zzForAction;
}
}
}
// store back cached position
zzMarkedPos = zzMarkedPosL;
switch (zzAction < 0 ? zzAction : ZZ_ACTION[zzAction]) {
case 1:
{ return com.intellij.psi.TokenType.BAD_CHARACTER;
}
case 6: break;
case 4:
{ return BNF_STRING;
}
case 7: break;
case 5:
{ return BNF_NUMBER;
}
case 8: break;
case 3:
{ return BNF_ID;
}
case 9: break;
case 2:
{ return com.intellij.psi.TokenType.WHITE_SPACE;
}
case 10: break;
default:
if (zzInput == YYEOF && zzStartRead == zzCurrentPos) {
zzAtEOF = true;
return null;
}
else {
zzScanError(ZZ_NO_MATCH);
}
}
}
}
}

File diff suppressed because one or more lines are too long

View File

@@ -0,0 +1,93 @@
(function(root, factory) {
if (typeof define === 'function' && define.amd) {
define(['lodash'], factory);
} else if (typeof exports !== 'undefined') {
module.exports = factory(require('lodash'));
} else {
root.Namespace = factory(root._);
}
})(this, function(_) {
'use strict';
/**
* @module namespace
* @class namespace
*/
function Namespace() {}
/**
* Regex for splitting keypaths into arrays.
*
* @private
* @const {RegExp}
* @type
*/
var KEYPATH_SPLITTER = /\./g;
/**
* An internal cache to avoid calculating a keypath more than once.
*
* @private
* @type {Object}
*/
var _keypaths = {};
_.extend(Namespace.prototype, {
/**
* Adds a definition to the namespace object.
*
* @public
* @instance
* @method add
* @param {String} keypath - The keypath for the definition to be added at.
* @param {Function|Object} definition - The definition to be added.
* @return {Function|Object} - The definition.
*/
add: function(keypath, definition) {
return this._walk(keypath, function(memo, name, index, keypath) {
if (index + 1 === keypath.length) {
memo[name] = _.extend(definition, memo[name]);
}
return memo[name] || (memo[name] = {});
});
},
/**
* Retrieves a definition from the namespace safely.
*
* @public
* @instance
* @method get
* @param {String} keypath - The keypath to lookup a definition for.
* @returns {Function|Object|undefined} - The definition if it exists, otherwise `undefined`.
*/
get: function(keypath) {
return this._walk(keypath);
},
/**
* An internal function for walking a keypath.
*
* @private
* @instance
* @method _walk
* @param {String} keypath - The keypath to walk through.
* @param {Function} [callback] - An optional callback to be called at each item in the path.
* @returns {function|Object|undefined} - The reduced keypath.
*/
_walk: function(keypath, callback) {
return _.reduce(
_keypaths[keypath] || (_keypaths[keypath] = keypath.split(KEYPATH_SPLITTER)),
callback || function(memo, name) {
return memo && memo[name];
},
this
);
}
});
return Namespace;
});
//# sourceMappingURL=namespace.js.map

View File

@@ -0,0 +1,78 @@
%!PS-AdobeFont-1.0: Greek_Lambda_Character-Regular 020.017
%%Title: Greek_Lambda_Character-Regular
%Version: 020.017
%%CreationDate: Sun Jul 23 23:14:02 2017
%%Creator: John Gardner /ThatGuyLinguistWatchersAreSickOf
%Copyright: NONE. NADA. PUBLIC DOMAIN, BOI
% Generated by FontForge 20170719 (http://fontforge.sf.net/)
%%EndComments
10 dict begin
/FontType 1 def
/FontMatrix [0.000488281 0 0 0.000488281 0 0 ]readonly def
/FontName /Greek_Lambda_Character-Regular def
/FontBBox {68 -362 1158 1556 }readonly def
/PaintType 0 def
/FontInfo 11 dict dup begin
/version (020.017) readonly def
/Notice (NONE. NADA. PUBLIC DOMAIN, BOI) readonly def
/FullName (Greek_Lambda_Character Regular) readonly def
/FamilyName (Greek_Lambda_Character) readonly def
/Weight (Regular) readonly def
/FSType 8 def
/ItalicAngle 0 def
/isFixedPitch false def
/UnderlinePosition -175 def
/UnderlineThickness 90 def
/ascent 1556 def
end readonly def
/Encoding 256 array
0 1 255 { 1 index exch /.notdef put} for
dup 13/uni000D put
dup 32/space put
readonly def
currentdict end
currentfile eexec
743F8413F3636CA85A9FFEFB50B4BB27302A5F6C876586CCC1670A7EF5521E6ADE15AAB4
DD2DDDB83735311FC63DB80D2C96AECFA05BB67F865EA35934B4B79A203A8DD489B09C79
FF6EB9DBCFD889C3E73F8C94BC342AF671D6F688870A62EE1A0DF216E150FFEC64A8C2B7
509AD05C011599C1AD84E6C4B668E07EA219BD72663D8AF4CA8EC8E23AA90DE90BE940C6
6DB849CEDB3B64961365A7CCE47F4FC9E30FDEE4B14B90C2E0D8C344EBC974EABF417B3D
28251A78ACEE2BFC4212B1E3E9C7EBC3262821EE98E538713C64DF0BC13C19337B1307DE
427F2C8C4FF2126B9FF99DDDFB332AF3EBCD23349171C5C7E66B1C53EF39C35516C0DC7D
D6F70D59C28603519559B710C0128C22BA268D330AE02690534939E5B58BAAB00A59B0E6
E515E80A04F2B315A00FB1CA49A4B0259167BBA9852544092955FE0883DBCD89B739AA94
8C6277B54019E9870B44E5A998F0F2175545F602CC4630E4BE01E35F4CEFDD7D1B737358
1B2D6F612B23AFAF221AB18CD868E7548EE5064762971F8868940F26134CA7BB40734220
06EEBE10B5F5251FE36B0E794D8B0D7AC814F4C30BB5A0435C203748B40217BEAF506328
D30E9417382B228FF6DF97A71135B613CC3EF4CDD8276711842C3DB0D31A25883544FFF5
D8719874AD10816F3EB272D5C245440127B9F2967DC46EAF4BED77FF2C2585A355EB586E
09BE8BA3D104EA174EA004DCE00066F7606EEF55376E332566235B8F4283D63E2AE5F2E3
6A6940BED4FA83C6E82C444B2D790BAA36E641908C61F6409CA09D9723B2F169BE42B166
5E704AA69D594B2A22C631D030029B772C3C200A0A483A13403E2EF19274B73860FFEA37
521C65865A79A5429EED01C0F95AB6F66F030BF14CCB42A6DA2435B74135BB5D35EF6DC6
FE48D29690417795B9D7AA2A3D66B7A175F8B6C23AFB5A472343C2DFDCC242C27033DD6F
B65B1CD5DC29B9DA455FAAC3A9408BB9B61C2B701741C34CE88EBFF05B7045CBEAC8788E
548F0E642277D4F9AED30A3431070805910970243FD069130D641AFAE28754B1355618B8
706D163027980F2A76A59E0B9D2FB92BF17943129A8A6FB15CBF6D376C464329E24A118B
8E38987CDD4CD5F214B26D65BEEDB2ECD7502861747089298A653654062A3AD5AC4FC4AF
5DEF7BDD4C5CDBC828A1F12235F68522B45BE0C2D6FB663B51E76BF559815F41C25EC7F8
17A92B9E139F3E6D27A8512A261EE576459DE603F386516E46B8326574E900101A8D49C9
E8798E734F56C4E2D78B14D6B14E96F3523F3F34464235C96F94609852F90A0A1B51CE14
25EC76E65E5D1C5D2EEC730FC438F4083A806C67D0A1566D0E6EF70DC38E64DCA44DFA66
20073E848BDFDB2B793AFE29B47917D87CB8320F48A7C083CAAD2CA20C25D705D78B33D5
A5EAB8C671A57BBC3C896DE54C71B8F860B640B7C30199A2443353E3727FA99C168DB4E8
ABE021B7DDB526E30B2075028BD8D387D32E025E458D05E3A25B9BC94ED54F7A0FCA3D61
604D2F49A205CBDF5FBDD5D3E1D31C6F7310CC155982689FA1F5723DDAE2EDEB5E69F03C
4CCDBBCB348E57AF222744A07DCCE69236EB29499A87C0A201EDD6A402EA23C035732184
339049E8C101388304CD20EAA8554A9B6E4F2B126A0593F4401992E73F7BFEF89DE3DFDE
DA548E3EE05D5B4EE0D114F916
0000000000000000000000000000000000000000000000000000000000000000
0000000000000000000000000000000000000000000000000000000000000000
0000000000000000000000000000000000000000000000000000000000000000
0000000000000000000000000000000000000000000000000000000000000000
0000000000000000000000000000000000000000000000000000000000000000
0000000000000000000000000000000000000000000000000000000000000000
0000000000000000000000000000000000000000000000000000000000000000
0000000000000000000000000000000000000000000000000000000000000000
cleartomark

72
_testdata/R/import.Rd Normal file
View File

@ -0,0 +1,72 @@
% Generated by roxygen2: do not edit by hand
% Please edit documentation in R/hello.R
\name{import}
\alias{import}
\title{Import a module into the current scope}
\usage{
import(module, attach, attach_operators = TRUE)
}
\arguments{
\item{module}{an identifier specifying the full module path}
\item{attach}{if \code{TRUE}, attach the newly loaded module to the object
search path (see \code{Details})}
\item{attach_operators}{if \code{TRUE}, attach operators of module to the
object search path, even if \code{attach} is \code{FALSE}}
}
\value{
the loaded module environment (invisible)
}
\description{
\code{module = import('module')} imports a specified module and makes its
code available via the environment-like object it returns.
}
\details{
Modules are loaded in an isolated environment which is returned, and
optionally attached to the object search path of the current scope (if
argument \code{attach} is \code{TRUE}).
\code{attach} defaults to \code{FALSE}. However, in interactive code it is
often helpful to attach packages by default. Therefore, in interactive code
invoked directly from the terminal only (i.e. not within modules),
\code{attach} defaults to the value of \code{options('import.attach')}, which
can be set to \code{TRUE} or \code{FALSE} depending on the user's preference.
\code{attach_operators} causes \emph{operators} to be attached by default,
because operators can only be invoked in R if they're found in the search
path. Not attaching them therefore drastically limits a module's usefulness.
Modules are searched in the module search path \code{options('import.path')}.
This is a vector of paths to consider, from the highest to the lowest
priority. The current directory is \emph{always} considered first. That is,
if a file \code{a.r} exists both in the current directory and in a module
search path, the local file \code{./a.r} will be loaded.
Module names can be fully qualified to refer to nested paths. See
\code{Examples}.
}
\note{
Unlike for packages, attaching happens \emph{locally}: if
\code{import} is executed in the global environment, the effect is the same.
Otherwise, the imported module is inserted as the parent of the current
\code{environment()}. When used (globally) \emph{inside} a module, the newly
imported module is only available inside the module's search path, not
outside it (nor in other modules which might be loaded).
}
\examples{
# `a.r` is a file in the local directory containing a function `f`.
a = import('a')
a$f()
# b/c.r is a file in path `b`, containing a function `g`.
import('b/c', attach = TRUE)
g() # No module name qualification necessary
}
\seealso{
\code{unload}
\code{reload}
\code{module_name}
}

View File

@ -328,15 +328,13 @@ func getInterpreter(data []byte) (interpreter string) {
return
}
-func getFirstLine(data []byte) []byte {
-	buf := bufio.NewScanner(bytes.NewReader(data))
-	buf.Scan()
-	line := buf.Bytes()
-	if err := buf.Err(); err != nil {
-		return nil
+func getFirstLine(content []byte) []byte {
+	nlpos := bytes.IndexByte(content, '\n')
+	if nlpos < 0 {
+		return content
	}
-	return line
+	return content[:nlpos]
}
func hasShebang(line []byte) bool {

806
data/generated.go Normal file
View File

@ -0,0 +1,806 @@
package data
import (
"bytes"
"strings"
"github.com/go-enry/go-enry/v2/regex"
)
// GeneratedCodeExtensions contains all extensions that belong to generated
// files for sure.
var GeneratedCodeExtensions = map[string]struct{}{
// XCode files
".nib": {},
".xcworkspacedata": {},
".xcuserstate": {},
}
// GeneratedCodeNameMatcher is a function that tells whether the file with the
// given name is generated.
type GeneratedCodeNameMatcher func(string) bool
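// nameMatches returns a matcher that reports whether the file name matches
// the given regular expression.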
func nameMatches(pattern string) GeneratedCodeNameMatcher {
r := regex.MustCompile(pattern)
return func(name string) bool {
return r.MatchString(name)
}
}
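// nameContains returns a matcher that reports whether the file name contains
// the given substring.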
func nameContains(pattern string) GeneratedCodeNameMatcher {
return func(name string) bool {
return strings.Contains(name, pattern)
}
}
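// nameEndsWith returns a matcher that reports whether the file name ends with
// the given suffix.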
func nameEndsWith(pattern string) GeneratedCodeNameMatcher {
return func(name string) bool {
return strings.HasSuffix(name, pattern)
}
}
// GeneratedCodeNameMatchers are all the matchers that check whether the code
// is generated based only on the file name.
var GeneratedCodeNameMatchers = []GeneratedCodeNameMatcher{
// Cocoa pods
nameMatches(`(^Pods|\/Pods)\/`),
// Carthage build
nameMatches(`(^|\/)Carthage\/Build\/`),
// NET designer file
nameMatches(`(?i)\.designer\.(cs|vb)$`),
// Generated NET specflow feature file
nameEndsWith(".feature.cs"),
// Node modules
nameContains("node_modules/"),
// Go vendor
nameMatches(`vendor\/([-0-9A-Za-z]+\.)+(com|edu|gov|in|me|net|org|fm|io)`),
// Go lock
nameEndsWith("Gopkg.lock"),
nameEndsWith("glide.lock"),
// Esy lock
nameMatches(`(^|\/)(\w+\.)?esy.lock$`),
// NPM shrinkwrap
nameEndsWith("npm-shrinkwrap.json"),
// NPM package lock
nameEndsWith("package-lock.json"),
// Yarn plugnplay
nameMatches(`(^|\/)\.pnp\.(c|m)?js$`),
// Godeps
nameContains("Godeps/"),
// Composer lock
nameEndsWith("composer.lock"),
// Generated by zephir
nameMatches(`.\.zep\.(?:c|h|php)$`),
// Cargo lock
nameEndsWith("Cargo.lock"),
// Pipenv lock
nameEndsWith("Pipfile.lock"),
// GraphQL relay
nameContains("__generated__/"),
}
// GeneratedCodeMatcher checks whether the file with the given data is
// generated code.
type GeneratedCodeMatcher func(path, ext string, content []byte) bool
// GeneratedCodeMatchers is the list of all generated code matchers that
// rely on checking the content of the file to make the guess.
var GeneratedCodeMatchers = []GeneratedCodeMatcher{
isMinifiedFile,
hasSourceMapReference,
isSourceMap,
isCompiledCoffeeScript,
isGeneratedNetDocfile,
isGeneratedJavaScriptPEGParser,
isGeneratedPostScript,
isGeneratedGo,
isGeneratedProtobuf,
isGeneratedJavaScriptProtocolBuffer,
isGeneratedApacheThrift,
isGeneratedJNIHeader,
isVCRCassette,
isCompiledCythonFile,
isGeneratedModule,
isGeneratedUnity3DMeta,
isGeneratedRacc,
isGeneratedJFlex,
isGeneratedGrammarKit,
isGeneratedRoxygen2,
isGeneratedJison,
isGeneratedGRPCCpp,
isGeneratedDart,
isGeneratedPerlPPPortHeader,
isGeneratedGameMakerStudio,
isGeneratedGimp,
isGeneratedVisualStudio6,
isGeneratedHaxe,
isGeneratedHTML,
isGeneratedJooq,
}
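// canBeMinified returns whether the extension belongs to a file type that may
// be minified (JavaScript or CSS).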
func canBeMinified(ext string) bool {
return ext == ".js" || ext == ".css"
}
// isMinifiedFile returns whether the file may be minified.
// We consider any CSS or JS file whose average number of chars
// per line is more than 110 to be minified.
func isMinifiedFile(path, ext string, content []byte) bool {
if !canBeMinified(ext) {
return false
}
var chars, lines uint64
forEachLine(content, func(line []byte) {
chars += uint64(len(line))
lines++
})
if lines == 0 {
return false
}
return chars/lines > 110
}
var sourceMapRegex = regex.MustCompile(`^\/[*\/][\#@] source(?:Mapping)?URL|sourceURL=`)
// hasSourceMapReference returns whether the file contains a reference to a
// source-map file.
func hasSourceMapReference(_ string, ext string, content []byte) bool {
if !canBeMinified(ext) {
return false
}
for _, line := range getLines(content, -2) {
if sourceMapRegex.Match(line) {
return true
}
}
return false
}
var sourceMapRegexps = []regex.EnryRegexp{
regex.MustCompile(`^{"version":\d+,`),
regex.MustCompile(`^\/\*\* Begin line maps\. \*\*\/{`),
}
// isSourceMap returns whether the file itself is a source map.
func isSourceMap(path, _ string, content []byte) bool {
if strings.HasSuffix(path, ".js.map") || strings.HasSuffix(path, ".css.map") {
return true
}
firstLine := getLines(content, 1)[0]
for _, r := range sourceMapRegexps {
if r.Match(firstLine) {
return true
}
}
return false
}
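// isCompiledCoffeeScript returns whether a .js file looks like the output of
// the CoffeeScript compiler, based on its wrapper function and the compiler's
// characteristic temporary variable names.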
func isCompiledCoffeeScript(path, ext string, content []byte) bool {
if ext != ".js" {
return false
}
firstLine := getLines(content, 1)[0]
lastLines := getLines(content, -2)
if string(firstLine) == "(function() {" &&
string(lastLines[1]) == "}).call(this);" &&
string(lastLines[0]) == "" {
score := 0
forEachLine(content, func(line []byte) {
if bytes.Contains(line, []byte("var ")) {
// Underscored temp vars are likely to be Coffee
score += 1 * countAppearancesInLine(line, "_fn", "_i", "_len", "_ref", "_results")
// bind and extend functions are very Coffee specific
score += 3 * countAppearancesInLine(line, "__bind", "__extends", "__hasProp", "__indexOf", "__slice")
}
})
// Require a score of 3. This is fairly arbitrary. Consider tweaking later.
// See: https://github.com/github/linguist/blob/master/lib/linguist/generated.rb#L176-L213
return score >= 3
}
return false
}
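// isGeneratedNetDocfile returns whether an .xml file is a .NET XML documentation
// file emitted by the compiler (a <doc> root holding an <assembly> section).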
func isGeneratedNetDocfile(_, ext string, content []byte) bool {
if ext != ".xml" {
return false
}
lines := bytes.Split(content, []byte{'\n'})
if len(lines) <= 3 {
return false
}
return bytes.Contains(lines[1], []byte("<doc>")) &&
bytes.Contains(lines[2], []byte("<assembly>")) &&
bytes.Contains(lines[len(lines)-2], []byte("</doc>"))
}
var pegJavaScriptGeneratedRegex = regex.MustCompile(`^(?:[^\/]|\/[^\*])*\/\*(?:[^\*]|\*[^\/])*Generated by PEG.js`)
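// isGeneratedJavaScriptPEGParser returns whether a .js file is a parser
// generated by PEG.js.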
func isGeneratedJavaScriptPEGParser(_, ext string, content []byte) bool {
if ext != ".js" {
return false
}
// PEG.js-generated parsers include a comment near the top of the file
// that marks them as such.
return pegJavaScriptGeneratedRegex.Match(bytes.Join(getLines(content, 5), []byte("")))
}
var postScriptType1And42Regex = regex.MustCompile(`(\n|\r\n|\r)\s*(?:currentfile eexec\s+|\/sfnts\s+\[)`)
var postScriptRegexes = []regex.EnryRegexp{
regex.MustCompile(`[0-9]|draw|mpage|ImageMagick|inkscape|MATLAB`),
regex.MustCompile(`PCBNEW|pnmtops|\(Unknown\)|Serif Affinity|Filterimage -tops`),
}
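// isGeneratedPostScript returns whether a PostScript file was produced by a
// font converter or another known generator, judging by the embedded font data
// and the %%Creator comment.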
func isGeneratedPostScript(_, ext string, content []byte) bool {
if ext != ".ps" && ext != ".eps" && ext != ".pfa" {
return false
}
// Type 1 and Type 42 fonts converted to PostScript are stored as hex-encoded byte streams; these
// streams are always preceded by the `eexec` operator (if Type 1), or the `/sfnts` key (if Type 42).
if postScriptType1And42Regex.Match(content) {
return true
}
// We analyze the "%%Creator:" comment, which contains the author/generator
// of the file. If there is one, it should be in one of the first few lines.
var creator []byte
for _, line := range getLines(content, 10) {
if bytes.HasPrefix(line, []byte("%%Creator: ")) {
creator = line
break
}
}
if len(creator) == 0 {
return false
}
// EAGLE doesn't include a version number when it generates PostScript.
// However, it does prepend its name to the document's "%%Title" field.
if bytes.Contains(creator, []byte("EAGLE")) {
for _, line := range getLines(content, 5) {
if bytes.HasPrefix(line, []byte("%%Title: EAGLE Drawing ")) {
return true
}
}
}
// Most generators write their version number, while human authors' or companies'
// names don't contain numbers. So check whether the line contains digits. Also
// look for some special cases without version numbers.
for _, r := range postScriptRegexes {
if r.Match(creator) {
return true
}
}
return false
}
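// isGeneratedGo returns whether a .go file carries a "Code generated by"
// marker within its first 40 lines.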
func isGeneratedGo(_, ext string, content []byte) bool {
if ext != ".go" {
return false
}
lines := getLines(content, 40)
if len(lines) <= 1 {
return false
}
for _, line := range lines {
if bytes.Contains(line, []byte("Code generated by")) {
return true
}
}
return false
}
var protoExtensions = map[string]struct{}{
".py": {},
".java": {},
".h": {},
".cc": {},
".cpp": {},
".m": {},
".rb": {},
".php": {},
}
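// isGeneratedProtobuf returns whether the file was emitted by the protocol
// buffer compiler, which marks its output with a DO NOT EDIT notice.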
func isGeneratedProtobuf(_, ext string, content []byte) bool {
if _, ok := protoExtensions[ext]; !ok {
return false
}
lines := getLines(content, 3)
if len(lines) <= 1 {
return false
}
for _, line := range lines {
if bytes.Contains(line, []byte("Generated by the protocol buffer compiler. DO NOT EDIT!")) {
return true
}
}
return false
}
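// isGeneratedJavaScriptProtocolBuffer returns whether a .js file is generated
// protocol buffer code, identified by the marker on its sixth line.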
func isGeneratedJavaScriptProtocolBuffer(_, ext string, content []byte) bool {
if ext != ".js" {
return false
}
lines := getLines(content, 6)
if len(lines) < 6 {
return false
}
return bytes.Contains(lines[5], []byte("GENERATED CODE -- DO NOT EDIT!"))
}
var apacheThriftExtensions = map[string]struct{}{
".rb": {},
".py": {},
".go": {},
".js": {},
".m": {},
".java": {},
".h": {},
".cc": {},
".cpp": {},
".php": {},
}
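// isGeneratedApacheThrift returns whether the file was produced by the Apache
// Thrift compiler, which says so in the first few lines of its output.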
func isGeneratedApacheThrift(_, ext string, content []byte) bool {
if _, ok := apacheThriftExtensions[ext]; !ok {
return false
}
for _, line := range getLines(content, 6) {
if bytes.Contains(line, []byte("Autogenerated by Thrift Compiler")) {
return true
}
}
return false
}
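// isGeneratedJNIHeader returns whether a .h file is a machine-generated JNI
// header: a generated-file notice on the first line followed by #include <jni.h>.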
func isGeneratedJNIHeader(_, ext string, content []byte) bool {
if ext != ".h" {
return false
}
lines := getLines(content, 2)
if len(lines) < 2 {
return false
}
return bytes.Contains(lines[0], []byte("/* DO NOT EDIT THIS FILE - it is machine generated */")) &&
bytes.Contains(lines[1], []byte("#include <jni.h>"))
}
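// isVCRCassette returns whether a .yml file is a cassette recorded by the VCR
// testing library, judging by the recorded_with marker near the end of the file.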
func isVCRCassette(_, ext string, content []byte) bool {
if ext != ".yml" {
return false
}
lines := getLines(content, -2)
if len(lines) < 2 {
return false
}
return bytes.Contains(lines[1], []byte("recorded_with: VCR"))
}
func isCompiledCythonFile(_, ext string, content []byte) bool {
if ext != ".c" && ext != ".cpp" {
return false
}
lines := getLines(content, 1)
if len(lines) < 1 {
return false
}
return bytes.Contains(lines[0], []byte("Generated by Cython"))
}
func isGeneratedModule(_, ext string, content []byte) bool {
if ext != ".mod" {
return false
}
lines := getLines(content, 1)
if len(lines) < 1 {
return false
}
return bytes.Contains(lines[0], []byte("PCBNEW-LibModule-V")) ||
bytes.Contains(lines[0], []byte("GFORTRAN module version '"))
}
func isGeneratedUnity3DMeta(_, ext string, content []byte) bool {
if ext != ".meta" {
return false
}
lines := getLines(content, 1)
if len(lines) < 1 {
return false
}
return bytes.Contains(lines[0], []byte("fileFormatVersion: "))
}
func isGeneratedRacc(_, ext string, content []byte) bool {
if ext != ".rb" {
return false
}
lines := getLines(content, 3)
if len(lines) < 3 {
return false
}
return bytes.HasPrefix(lines[2], []byte("# This file is automatically generated by Racc"))
}
func isGeneratedJFlex(_, ext string, content []byte) bool {
if ext != ".java" {
return false
}
lines := getLines(content, 1)
if len(lines) < 1 {
return false
}
return bytes.HasPrefix(lines[0], []byte("/* The following code was generated by JFlex "))
}
func isGeneratedGrammarKit(_, ext string, content []byte) bool {
if ext != ".java" {
return false
}
lines := getLines(content, 1)
if len(lines) < 1 {
return false
}
return bytes.Contains(lines[0], []byte("// This is a generated file. Not intended for manual editing."))
}
func isGeneratedRoxygen2(_, ext string, content []byte) bool {
if ext != ".rd" {
return false
}
lines := getLines(content, 1)
if len(lines) < 1 {
return false
}
return bytes.Contains(lines[0], []byte("% Generated by roxygen2: do not edit by hand"))
}
func isGeneratedJison(_, ext string, content []byte) bool {
if ext != ".js" {
return false
}
lines := getLines(content, 1)
if len(lines) < 1 {
return false
}
return bytes.Contains(lines[0], []byte("/* parser generated by jison ")) ||
bytes.Contains(lines[0], []byte("/* generated by jison-lex "))
}
func isGeneratedGRPCCpp(_, ext string, content []byte) bool {
switch ext {
case ".cpp", ".hpp", ".h", ".cc":
lines := getLines(content, 1)
if len(lines) < 1 {
return false
}
return bytes.Contains(lines[0], []byte("// Generated by the gRPC"))
default:
return false
}
}
var dartRegex = regex.MustCompile(`generated code\W{2,3}do not modify`)
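// isGeneratedDart returns whether a .dart file declares itself as generated
// code in its first line.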
func isGeneratedDart(_, ext string, content []byte) bool {
if ext != ".dart" {
return false
}
lines := getLines(content, 1)
if len(lines) < 1 {
return false
}
return dartRegex.Match(bytes.ToLower(lines[0]))
}
func isGeneratedPerlPPPortHeader(name, _ string, content []byte) bool {
if !strings.HasSuffix(name, "ppport.h") {
return false
}
lines := getLines(content, 10)
if len(lines) < 10 {
return false
}
return bytes.Contains(lines[8], []byte("Automatically created by Devel::PPPort"))
}
var (
gameMakerStudioFirstLineRegex = regex.MustCompile(`^\d\.\d\.\d.+\|\{`)
gameMakerStudioThirdLineRegex = regex.MustCompile(`\"modelName\"\:\s*\"GM`)
)
func isGeneratedGameMakerStudio(_, ext string, content []byte) bool {
if ext != ".yy" && ext != ".yyp" {
return false
}
lines := getLines(content, 3)
if len(lines) < 3 {
return false
}
return gameMakerStudioThirdLineRegex.Match(lines[2]) ||
gameMakerStudioFirstLineRegex.Match(lines[0])
}
var gimpRegexes = []regex.EnryRegexp{
regex.MustCompile(`\/\* GIMP [a-zA-Z0-9\- ]+ C\-Source image dump \(.+?\.c\) \*\/`),
regex.MustCompile(`\/\* GIMP header image file format \([a-zA-Z0-9\- ]+\)\: .+?\.h \*\/`),
}
func isGeneratedGimp(_, ext string, content []byte) bool {
if ext != ".c" && ext != ".h" {
return false
}
lines := getLines(content, 1)
if len(lines) < 1 {
return false
}
for _, r := range gimpRegexes {
if r.Match(lines[0]) {
return true
}
}
return false
}
func isGeneratedVisualStudio6(_, ext string, content []byte) bool {
if ext != ".dsp" {
return false
}
for _, l := range getLines(content, 3) {
if bytes.Contains(l, []byte("# Microsoft Developer Studio Generated Build File")) {
return true
}
}
return false
}
var haxeExtensions = map[string]struct{}{
".js": {},
".py": {},
".lua": {},
".cpp": {},
".h": {},
".java": {},
".cs": {},
".php": {},
}
func isGeneratedHaxe(_, ext string, content []byte) bool {
if _, ok := haxeExtensions[ext]; !ok {
return false
}
for _, l := range getLines(content, 3) {
if bytes.Contains(l, []byte("Generated by Haxe")) {
return true
}
}
return false
}
var (
doxygenRegex = regex.MustCompile(`<!--\s+Generated by Doxygen\s+[.0-9]+\s*-->`)
htmlMetaRegex = regex.MustCompile(`<meta(\s+[^>]+)>`)
htmlMetaContentRegex = regex.MustCompile(`\s+(name|content|value)\s*=\s*("[^"]+"|'[^']+'|[^\s"']+)`)
orgModeMetaRegex = regex.MustCompile(`org\s+mode`)
)
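// isGeneratedHTML returns whether an HTML file was produced by a documentation
// generator such as pkgdown, mandoc, Doxygen, or a known tool named in a
// <meta name="generator"> tag.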
func isGeneratedHTML(_, ext string, content []byte) bool {
if ext != ".html" && ext != ".htm" && ext != ".xhtml" {
return false
}
lines := getLines(content, 30)
// Pkgdown
for _, l := range lines[:2] {
if bytes.Contains(l, []byte("<!-- Generated by pkgdown: do not edit by hand -->")) {
return true
}
}
// Mandoc
if len(lines) > 2 &&
bytes.HasPrefix(lines[2], []byte("<!-- This is an automatically generated file.")) {
return true
}
// Doxygen
for _, l := range lines {
if doxygenRegex.Match(l) {
return true
}
}
// HTML tag: <meta name="generator" content="" />
part := bytes.ToLower(bytes.Join(lines, []byte{' '}))
part = bytes.ReplaceAll(part, []byte{'\n'}, []byte{})
part = bytes.ReplaceAll(part, []byte{'\r'}, []byte{})
matches := htmlMetaRegex.FindAll(part, -1)
if len(matches) == 0 {
return false
}
for _, m := range matches {
var name, value, content string
ms := htmlMetaContentRegex.FindAllStringSubmatch(string(m), -1)
for _, m := range ms {
switch m[1] {
case "name":
name = m[2]
case "value":
value = m[2]
case "content":
content = m[2]
}
}
var val = value
if val == "" {
val = content
}
name = strings.Trim(name, `"'`)
val = strings.Trim(val, `"'`)
if name != "generator" || val == "" {
continue
}
if strings.Contains(val, "jlatex2html") ||
strings.Contains(val, "latex2html") ||
strings.Contains(val, "groff") ||
strings.Contains(val, "makeinfo") ||
strings.Contains(val, "texi2html") ||
strings.Contains(val, "ronn") ||
orgModeMetaRegex.MatchString(val) {
return true
}
}
return false
}
func isGeneratedJooq(_, ext string, content []byte) bool {
if ext != ".java" {
return false
}
for _, l := range getLines(content, 2) {
if bytes.Contains(l, []byte("This file is generated by jOOQ.")) {
return true
}
}
return false
}
// getLines returns up to the first n lines. A negative n will return up to
// the last n lines, in reverse order.
func getLines(content []byte, n int) [][]byte {
var result [][]byte
if n < 0 {
for pos := len(content); pos > 0 && len(result) < -n; {
nlpos := bytes.LastIndexByte(content[:pos], '\n')
if nlpos+1 < len(content)-1 {
result = append(result, content[nlpos+1:pos])
}
pos = nlpos
}
} else {
for pos := 0; pos < len(content) && len(result) < n; {
nlpos := bytes.IndexByte(content[pos:], '\n')
if nlpos < 0 && pos < len(content) {
nlpos = len(content)
} else if nlpos >= 0 {
nlpos += pos
}
result = append(result, content[pos:nlpos])
pos = nlpos + 1
}
}
return result
}
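// forEachLine calls cb once per line of content, with the line terminator
// stripped; a trailing newline does not yield an extra empty line.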
func forEachLine(content []byte, cb func([]byte)) {
var pos int
for pos < len(content) {
nlpos := bytes.IndexByte(content[pos:], '\n')
if nlpos < 0 && pos < len(content) {
nlpos = len(content)
} else if nlpos >= 0 {
nlpos += pos
}
cb(content[pos:nlpos])
pos = nlpos + 1
}
}
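// countAppearancesInLine returns the total number of occurrences of the target
// substrings in the line.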
func countAppearancesInLine(line []byte, targets ...string) int {
var count int
for _, t := range targets {
count += bytes.Count(line, []byte(t))
}
return count
}

50
data/generated_test.go Normal file
View File

@ -0,0 +1,50 @@
package data
import (
"fmt"
"strings"
"testing"
"github.com/stretchr/testify/require"
)
func TestForEachLine(t *testing.T) {
const sample = "foo\nbar\nboomboom\nbleepbloop\n"
var lines = strings.Split(sample, "\n")
var result []string
forEachLine([]byte(sample), func(l []byte) {
result = append(result, string(l))
})
require.Equal(t, lines[:len(lines)-1], result)
}
func TestGetLines(t *testing.T) {
const sample = "foo\nbar\nboomboom\nbleepbloop\n"
testCases := []struct {
lines int
expected []string
}{
{1, []string{"foo"}},
{2, []string{"foo", "bar"}},
{10, []string{"foo", "bar", "boomboom", "bleepbloop"}},
{-1, []string{"bleepbloop"}},
{-2, []string{"bleepbloop", "boomboom"}},
{-10, []string{"bleepbloop", "boomboom", "bar", "foo"}},
}
for _, tt := range testCases {
t.Run(fmt.Sprint(tt.lines), func(t *testing.T) {
lines := getLines([]byte(sample), tt.lines)
var result = make([]string, len(lines))
for i, l := range lines {
result[i] = string(l)
}
require.Equal(t, tt.expected, result)
})
}
}

View File

@ -217,6 +217,17 @@ public class Enry {
return toJavaBool(nativeLib.IsVendor(toGoString(path)));
}
/**
* Reports whether the given file is a generated file.
*
* @param path of the file
* @param content of the file
* @return whether it's autogenerated or not
*/
public static synchronized boolean isGenerated(String path, byte[] content) {
return toJavaBool(nativeLib.IsGenerated(toGoString(path), toGoByteSlice(content)));
}
/**
* Returns a color code for given language.
*

View File

@ -51,6 +51,12 @@ def is_vendor(filename: str) -> bool:
guess = lib.IsVendor(fName)
return go_bool_to_py(guess)
def is_generated(filename: str, content: bytes) -> bool:
fname, c_str = py_str_to_go(filename)
fcontent, c_bytes = py_bytes_to_go(content)
guess = lib.IsGenerated(fname, fcontent)
return go_bool_to_py(guess)
## Tests
from collections import namedtuple

View File

@ -126,6 +126,11 @@ func IsVendor(path string) bool {
return enry.IsVendor(path)
}
//export IsGenerated
func IsGenerated(path string, content []byte) bool {
return enry.IsGenerated(path, content)
}
//export GetColor
func GetColor(language string) string {
return enry.GetColor(language)

View File

@ -11,8 +11,13 @@ import (
const binSniffLen = 8000
-var configurationLanguages = map[string]bool{
-	"XML": true, "JSON": true, "TOML": true, "YAML": true, "INI": true, "SQL": true,
+var configurationLanguages = map[string]struct{}{
+	"XML": {},
+	"JSON": {},
+	"TOML": {},
+	"YAML": {},
+	"INI": {},
+	"SQL": {},
}
// IsConfiguration tells if filename is in one of the configuration languages.
@ -102,3 +107,27 @@ func matchRegexSlice(exprs []regex.EnryRegexp, str string) bool {
return false
}
// IsGenerated returns whether the file with the given path and content is a
// generated file.
func IsGenerated(path string, content []byte) bool {
ext := strings.ToLower(filepath.Ext(path))
if _, ok := data.GeneratedCodeExtensions[ext]; ok {
return true
}
for _, m := range data.GeneratedCodeNameMatchers {
if m(path) {
return true
}
}
path = strings.ToLower(path)
for _, m := range data.GeneratedCodeMatchers {
if m(path, ext, content) {
return true
}
}
return false
}
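// A minimal usage sketch (hypothetical inputs, not part of this change;
// minifiedJS stands for a []byte of minified JavaScript):
//
//	IsGenerated("Godeps/Godeps.json", nil)            // true, matched by file name
//	IsGenerated("bundle.min.js", minifiedJS)          // true once the average line exceeds 110 chars
//	IsGenerated("main.go", []byte("package main\n"))  // false, no generated-code marker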

View File

@ -3,9 +3,12 @@ package enry
import (
"bytes"
"fmt"
"io/ioutil"
"path/filepath"
"testing"
"github.com/stretchr/testify/assert"
"github.com/stretchr/testify/require"
)
func TestIsVendor(t *testing.T) {
@ -192,3 +195,171 @@ func TestGetColor(t *testing.T) {
assert.Equal(t, test.expected, color, fmt.Sprintf("%v: is = %v, expected: %v", test.name, color, test.expected))
}
}
func TestIsGenerated(t *testing.T) {
testCases := []struct {
file string
load bool
generated bool
}{
// Xcode project files
{"Binary/MainMenu.nib", false, true},
{"Dummy/foo.xcworkspacedata", false, true},
{"Dummy/foo.xcuserstate", false, true},
//Cocoapods
{"Pods/Pods.xcodeproj", false, true},
{"Pods/SwiftDependency/foo.swift", false, true},
{"Pods/ObjCDependency/foo.h", false, true},
{"Pods/ObjCDependency/foo.m", false, true},
{"Dummy/Pods/Pods.xcodeproj", false, true},
{"Dummy/Pods/SwiftDependency/foo.swift", false, true},
{"Dummy/Pods/ObjCDependency/foo.h", false, true},
{"Dummy/Pods/ObjCDependency/foo.m", false, true},
//Carthage
{"Carthage/Build/.Dependency.version", false, true},
{"Carthage/Build/iOS/Dependency.framework", false, true},
{"Carthage/Build/Mac/Dependency.framework", false, true},
{"src/Carthage/Build/.Dependency.version", false, true},
{"src/Carthage/Build/iOS/Dependency.framework", false, true},
{"src/Carthage/Build/Mac/Dependency.framework", false, true},
//Go-specific vendored paths
{"go/vendor/github.com/foo.go", false, true},
{"go/vendor/golang.org/src/foo.c", false, true},
{"go/vendor/gopkg.in/some/nested/path/foo.go", false, true},
//.NET designer file
{"Dummy/foo.designer.cs", false, true},
{"Dummy/foo.Designer.cs", false, true},
{"Dummy/foo.designer.vb", false, true},
{"Dummy/foo.Designer.vb", false, true},
//Composer generated composer.lock file
{"JSON/composer.lock", false, true},
//Node modules
{"Dummy/node_modules/foo.js", false, true},
//npm shrinkwrap file
{"Dummy/npm-shrinkwrap.json", false, true},
{"Dummy/package-lock.json", false, true},
{"JavaScript/jquery-1.6.1.min.js", true, true},
//Yarn Plug'n'Play file
{".pnp.js", false, true},
{".pnp.cjs", false, true},
{".pnp.mjs", false, true},
//Godep saved dependencies
{"Godeps/Godeps.json", false, true},
{"Godeps/_workspace/src/github.com/kr/s3/sign.go", false, true},
//Generated by Zephir
{"C/exception.zep.c", false, true},
{"C/exception.zep.h", false, true},
{"PHP/exception.zep.php", false, true},
//Minified files
{"JavaScript/jquery-1.6.1.min.js", true, true},
//JavaScript with source-maps
{"JavaScript/namespace.js", true, true},
{"Generated/inline.js", true, true},
//CSS with source-maps
{"Generated/linked.css", true, true},
{"Generated/inline.css", true, true},
//Source-map
{"Data/bootstrap.css.map", true, true},
{"Generated/linked.css.map", true, true},
{"Data/sourcemap.v3.map", true, true},
{"Data/sourcemap.v1.map", true, true},
//Specflow
{"Features/BindingCulture.feature.cs", false, true},
//JFlex
{"Java/JFlexLexer.java", true, true},
//GrammarKit
{"Java/GrammarKit.java", true, true},
//roxygen2
{"R/import.Rd", true, true},
//PostScript
{"PostScript/lambda.pfa", true, true},
//Perl ppport.h
{"Generated/ppport.h", true, true},
//Graphql Relay
{"Javascript/__generated__/App_user.graphql.js", false, true},
//Game Maker Studio 2
{"JSON/GMS2_Project.yyp", true, true},
{"JSON/2ea73365-b6f1-4bd1-a454-d57a67e50684.yy", true, true},
{"Generated/options_main.inherited.yy", true, true},
//Pipenv
{"Dummy/Pipfile.lock", false, true},
//HTML
{"HTML/attr-swapped.html", true, true},
{"HTML/extra-attr.html", true, true},
{"HTML/extra-spaces.html", true, true},
{"HTML/extra-tags.html", true, true},
{"HTML/grohtml.html", true, true},
{"HTML/grohtml.xhtml", true, true},
{"HTML/makeinfo.html", true, true},
{"HTML/mandoc.html", true, true},
{"HTML/node78.html", true, true},
{"HTML/org-mode.html", true, true},
{"HTML/quotes-double.html", true, true},
{"HTML/quotes-none.html", true, true},
{"HTML/quotes-single.html", true, true},
{"HTML/uppercase.html", true, true},
{"HTML/ronn.html", true, true},
{"HTML/unknown.html", true, false},
{"HTML/no-content.html", true, false},
{"HTML/pages.html", true, true},
//GIMP
{"C/image.c", true, true},
{"C/image.h", true, true},
//Haxe
{"Generated/Haxe/main.js", true, true},
{"Generated/Haxe/main.py", true, true},
{"Generated/Haxe/main.lua", true, true},
{"Generated/Haxe/Main.cpp", true, true},
{"Generated/Haxe/Main.h", true, true},
{"Generated/Haxe/Main.java", true, true},
{"Generated/Haxe/Main.cs", true, true},
{"Generated/Haxe/Main.php", true, true},
}
for _, tt := range testCases {
t.Run(tt.file, func(t *testing.T) {
var content []byte
if tt.load {
var err error
content, err = ioutil.ReadFile(filepath.Join("_testdata", tt.file))
require.NoError(t, err)
}
result := IsGenerated(tt.file, content)
require.Equal(t, tt.generated, result)
})
}
}
func TestFoo(t *testing.T) {
file := "HTML/uppercase.html"
content, err := ioutil.ReadFile("_testdata/" + file)
require.NoError(t, err)
require.True(t, IsGenerated(file, content))
}