Summary:
The `"PASS"` label only makes sense for tests. This commit makes the
labels configurable, so that the verbiage can make more sense in other
contexts, too.
Test Plan:
Apply a patch like
```diff
diff --git a/config/travis.js b/config/travis.js
index af0996b..b0ab3b6 100644
--- a/config/travis.js
+++ b/config/travis.js
@@ -10,7 +10,11 @@ function main() {
process.argv.includes("--full")
? "FULL"
: "BASIC";
- execDependencyGraph(makeTasks(mode)).then(({success}) => {
+ execDependencyGraph(makeTasks(mode), {
+ taskLaunchLabel: " YO ",
+ taskPassLabel: "WHEE",
+ taskFailLabel: "UHOH",
+ }).then(({success}) => {
process.exitCode = success ? 0 : 1;
});
}
```
and note that `GITHUB_TOKEN=none yarn travis --full` exhibits all
desired messages.
wchargin-branch: configurable-labels
Summary:
This commit implements the `sourcecred` command-line utility, which has
three subcommands:
- `plugin-graph` creates one plugin’s graph;
- `combine` combines multiple on-disk graphs; and
- `graph` creates all plugins’ graphs and combines them.
As an implementation detail, the `into.sh` script is very convenient:
it avoids the need to do any pipe management in Node (which is Not Fun).
When we build for release, we may want to factor that differently.
Test Plan:
To see it all in action, run `yarn backend`, and then try:
```
$ export SOURCECRED_GITHUB_TOKEN="your_token_here"
$ node ./bin/sourcecred.js graph sourcecred sourcecred
Using output directory: /tmp/sourcecred/sourcecred
Starting tasks
GO create-git
GO create-github
PASS create-github
PASS create-git
GO combine
PASS combine
Full results
PASS create-git
PASS create-github
PASS combine
Overview
Final result: SUCCESS
$ ls /tmp/sourcecred/sourcecred/
graph-github.json graph-git.json graph.json
$ jq '.nodes | length' /tmp/sourcecred/sourcecred/*.json
1000
7302
8302
```
The `node sourcecred.js graph` command takes 9.8s for me.
(The salient point of the last command is that the two small graphs have
node count adding up to the node count of the big graph. Incidentally,
we are [almost][1] at a nice round number of nodes in the GitHub graph.)
[1]: https://xkcd.com/1000/
wchargin-branch: cli
Summary:
To be honest, I have no idea what exactly this does or why it’s
necessary, but if we don’t do this then it is not possible to `import`
the exported member from a Webpack-bundled script. I’ve seen this
pattern before; one day I’ll actually figure out what it does. :-)
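For reference, the usual shape of this pattern is to tell Webpack to
expose the entry point's exports as a CommonJS module. A minimal sketch,
assuming a config along these lines (the entry path and option choice
here are guesses, not necessarily what this commit does):
```js
// Sketch of a backend Webpack config; names and paths are assumptions.
const path = require("path");

module.exports = {
  entry: {execDependencyGraph: "./src/tools/execDependencyGraph.js"},
  output: {
    path: path.resolve(__dirname, "bin"),
    filename: "[name].js",
    // Emit the bundle as a CommonJS module so that consumers can
    // `require()` or `import` the exported member.
    libraryTarget: "commonjs2",
  },
};
```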
Test Plan:
Note that `yarn travis` (success) and `yarn travis --full` (failure; no
GitHub token) both have the expected behaviors.
wchargin-branch: execdependencygraph-export
This commit adds [oclif] as a command-line framework. It is successfully
integrated with webpack.
[oclif]: https://github.com/oclif/oclif
Usage:
`yarn backend` to build the CLI.
`node bin/sourcecred.js` to launch the CLI and see usage.
`node bin/sourcecred.js example` for one example command.
`node bin/sourcecred.js goodbye` for another example command.
Summary:
The backend Babel configuration now targets Node rather than the
browser. Consequently, Babel won't transform classes to their roughly
equivalent ES5 counterparts, etc.
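A minimal sketch of the kind of preset configuration involved (the
actual forked config in this repo may be structured differently):
```js
// Sketch only: target the Node version running the build, so ES2015+
// syntax such as `class` is left untransformed.
module.exports = {
  presets: [
    [require.resolve("babel-preset-env"), {targets: {node: "current"}}],
  ],
};
```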
Test Plan:
Create `src/classy.js` with `class X {}; console.log(X);`. Then, add a
build target for `classy: resolveApp("src/classy.js"),` in `paths.js`.
Use `yarn backend` and inspect the contents of `bin/classy.js`; in
particular, look at the definition of `X` (whatever the argument to
`console.log` is). Before this commit, the result will be a big
complicated mess. After this commit, it will be `class X {}`.
Note also that `yarn travis --full` passes, indicating that the two
manual tests, which call out to the utilities in `bin/`, still work.
wchargin-branch: target-node
Summary:
We want to change this configuration so that our compilation of backend
applications can target latest Node. This commit forks the current
configuration so that we can modify it easily.
Test Plan:
Both `yarn start` and `yarn travis` work. The generated backend
applications work, too.
wchargin-branch: fork-babel-config
Set up following the directions from [webpack-node-externals].
[webpack-node-externals]: https://www.npmjs.com/package/webpack-node-externals
This unblocks #210.
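The integration is the standard one from the package's README; roughly
(the surrounding config is elided and assumed):
```js
// Sketch: mark everything under node_modules as external, so it is
// `require()`d at runtime instead of being bundled.
const nodeExternals = require("webpack-node-externals");

module.exports = {
  target: "node",
  externals: [nodeExternals()],
  // ...rest of the existing backend configuration...
};
```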
Test plan: `yarn backend` still succeeds, and the binary scripts still
work. The resultant binaries are much smaller, as seen below (note that
the build time is about the same).
before:
```
❯ yarn backend
yarn run v1.5.1
$ node scripts/backend.js
Building backend applications...
Compiled successfully.
File sizes after gzip:
231.37 KB bin/printCombinedGraph.js
199.5 KB bin/fetchAndPrintGithubRepo.js
46.41 KB bin/cloneAndPrintGitGraph.js
21.48 KB bin/createExampleRepo.js
17.71 KB bin/loadAndPrintGitRepository.js
Build completed; results in 'bin'.
Done in 4.46s.
```
after:
```
❯ yarn backend
yarn run v1.5.1
$ node scripts/backend.js
Building backend applications...
Compiled successfully.
File sizes after gzip:
27.78 KB bin/printCombinedGraph.js
12.73 KB bin/cloneAndPrintGitGraph.js
12.41 KB bin/fetchAndPrintGithubRepo.js
6.03 KB bin/loadAndPrintGitRepository.js
5.52 KB bin/createExampleRepo.js
Build completed; results in 'bin'.
Done in 4.28s.
```
Summary:
We’d like to use the same abstraction for creating multiple cred graphs
and then combining them together. This will enable us to do that.
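Roughly, a caller describes a dependency graph of tasks and hands it to
`execDependencyGraph`; a sketch of the intended call shape (the module
path, task ids, and commands below are made up for illustration):
```js
// Sketch only.
const execDependencyGraph = require("./execDependencyGraph");

const tasks = [
  {id: "create-git", cmd: ["node", "./bin/createGitGraph.js"], deps: []},
  {id: "create-github", cmd: ["node", "./bin/createGithubGraph.js"], deps: []},
  {
    id: "combine",
    cmd: ["node", "./bin/combineGraphs.js"],
    deps: ["create-git", "create-github"],
  },
];

execDependencyGraph(tasks).then(({success}) => {
  process.exitCode = success ? 0 : 1;
});
```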
Test Plan:
Run `yarn travis` to test the success case, and `yarn travis --full`
(without setting a `GITHUB_TOKEN`) to test the failure case.
wchargin-branch: execdepgraph
`printCombinedGraph` loads and prints a cross-plugin combined
contribution graph for a given GitHub repository.
It is a simple executable wrapper around `src/tools/loadCombinedGraph`.
Example usage:
`node bin/printCombinedGraph.js sourcecred example-git $GITHUB_TOKEN`
`cloneAndPrintGitGraph` clones a git repository, and generates a Git
object graph for that repository.
This can be run as follows:
```
yarn backend;
node bin/cloneAndPrintGitGraph sourcecred example-git
```
This commit also adds two utility modules:
* `cloneAndLoadRepository`, which clones a Git repository to a tmpdir,
parses the `Repository` data out, and then cleans up.
* `cloneGitGraph`, which calls `cloneAndLoadRepository` and `createGraph`.
Test plan: These don't fit well into our CI, because they require
network access to clone repositories from GitHub. I verified that the
functions work via the demo script above.
fetchGithubGraph is a tiny module which is responsible for fetching
GitHub contribution data, and parsing it into a graph.
Test plan:
The function is trivial and does not merit explicit testing.
Summary:
If a commit causes a tree entry to change hash while keeping the same
name, we now add a BECOMES edge between the corresponding entries.
Test Plan:
Snapshot changes are readable enough to manually verify. Programmatic
tests also added.
wchargin-branch: graph-becomes-edges
Test Plan:
For the snapshot: verify that two of the BECOMES edges are the same as
those tested in `findBecomesEdgesForCommits` and that they have the
right commit hashes; then, verify that the remaining edge is correct.
wchargin-branch: high-level-becomes-repository
This adds MERGED_AS edges which link from a PullRequest to a Commit. It
adds a corresponding `mergedCommitHash` method on the porcelain PR that
returns the hash of the merged commit (if available).
I would have preferred to return a porcelain wrapper over the commit,
but since we don't have a porcelain Git api, it seemed preferable to
return the hash as a string. Returning a Node would both break
consistency in the porcelain api and be problematic, as the node does
not necessarily exist in the graph. To ensure that the hash is available
without parsing Addresses, I used the edge payload. :)
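A sketch of the intended usage (accessor names other than
`mergedCommitHash` are guesses based on the porcelain description):
```js
// Sketch only: assumes a porcelain PullRequest wrapper `pr` obtained from
// the GitHub porcelain api.
function describeMerge(pr) {
  const hash = pr.mergedCommitHash(); // string hash, or null if unavailable
  return hash == null ? "not merged" : `merged as commit ${hash}`;
}
```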
Test plan:
Inspect the snapshot changes in the graph (they are fairly readable) and
the api testing in api.test.js.
This commit adds a few fields to the PullRequest query fragment so that
we now retrieve merge commit SHAs. In cases where there is no merge
commit (i.e., the PR was not merged), the field is null. Observe that
this is the case for our example unmerged pull request.
Test plan: Inspect the changes to the demo data, and verify that they
are appropriate.
For the GitHub plugin to create edges pointing to commits from the Git
plugin, it needs a way to create the appropriate address given the
commit's hash. This commit exposes that functionality by moving
`makeAddress` out of the "createGraph" module and into a new "address"
module, and using it to implement `commitAddress`.
Test plan: The code is so trivial that I don't think it merits testing.
Our repository containing example GitHub data has been renamed from
"sourcecred/example-repo" to "sourcecred/example-github". This commit
updates all snapshots and tests to reflect this rename.
See [#190] for context.
The change is almost entirely straightforward; the only "interesting"
decision I made was to move the repo owner and repo name into the string
id for the Artifact Plugin addresses, as the id would otherwise not be
unique.
[#190]: https://github.com/sourcecred/sourcecred/issues/190#issuecomment-386362870
Summary:
Previously, a tree entry had exactly one `HAS_CONTENTS` edge, unless the
tree entry corresponded to a submodule commit, in which case we had no
information at all. Now, submodule commit tree entries point to zero or
more `SUBMODULE_COMMIT` nodes. In the vast majority of cases, there will
be exactly one such node—however, it is possible that a particular tree
entry could correspond to multiple submodules (clone two identical
submodules with different URLs) or none at all (some manually edited
`.gitmodules` or other corruption).
Test Plan:
The snapshot changes are easy enough to read and verify (two new nodes
and five new edges). Additionally, the path-to-submodule `createGraph`
test has been updated.
wchargin-branch: graph-submodule-urls
Summary:
In Git, a tree may point to a commit directly. In our graph, we’d like
to represent “submodule commits” explicitly, because, a priori, we do
not know the repository to which the commit belongs. A submodule commit
node will store the hash of the referent commit, as well as the URL to
the subproject as listed in the .gitmodules file. In this commit, we
load the list of those URLs into the in-memory repository.
Shout-out to `git` for having an excellent command-line API:
the `--blob` argument to `git-config` is perfect for this situation.
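Concretely, the loading step amounts to something like the following
sketch (the real code's structure and error handling will differ):
```js
// Sketch: read submodule URLs out of the `.gitmodules` blob of a given
// tree, without checking anything out. Assumes `.gitmodules` exists.
const {execFileSync} = require("child_process");

function submoduleUrls(repoDir, treeIsh) {
  const output = execFileSync(
    "git",
    [
      "config",
      "--blob",
      `${treeIsh}:.gitmodules`,
      "--get-regexp",
      "^submodule\\..*\\.url$",
    ],
    {cwd: repoDir, encoding: "utf8"}
  );
  // Each output line looks like: "submodule.<name>.url <url>".
  return output
    .trim()
    .split("\n")
    .map((line) => line.split(" ")[1]);
}
```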
Test Plan:
Snapshot updates are readable and sufficient.
wchargin-branch: load-submodule-urls
Summary:
Prior to this commit, given a `Tree` node with an edge to a `TreeEntry`
node, there was no way to tell what the entry name was other than
parsing the ID (which should never be required). This adds appropriate
data to the payload of a `TreeEntry`, and also to the inclusion edge (so
that if you only have the edge, you don’t have to fetch the entry).
Test Plan:
Snapshot changes are readable.
wchargin-branch: treeentry-metadata
Summary:
This could catch failures in build configuration or with Webpack. It’s
unlikely to catch any logic errors, because no production code is run.
In any case, it’s fast enough; it finishes at about the same time as
`ci-test` and `check-pretty`.
Test Plan:
From the repository root, run `rm -r bin; yarn travis`, and note that
the `bin/` directory is regenerated.
wchargin-branch: ci-backend
Summary:
This prevents the boilerplate output of the form
```
> sourcecred-explorer@0.1.0 check-pretty /home/wchargin/git/sourcecred
> prettier --list-different '**/*.js'
```
(superfluous linebreaks included). In the case that a script fails, it
also omits the giant “this is most likely not a problem with npm” block.
The downside to this is that it suppresses any errors in npm-run-script
itself. For instance, `npm run wat` produces “missing script: wat”,
while `npm run --silent wat` just silently exits with 1. This does not
silence the actual scripts themselves, so things like lint errors or
test failures will still appear.
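The change presumably amounts to threading `--silent` through the
`npm run` invocations in the CI task list; a sketch (task ids are
illustrative, and the actual location of the change may differ):
```js
// Sketch of the new command shape in travis.js.
const basicTasks = [
  {id: "check-pretty", cmd: ["npm", "run", "--silent", "check-pretty"], deps: []},
  {id: "lint", cmd: ["npm", "run", "--silent", "lint"], deps: []},
];
module.exports = basicTasks;
```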
Test Plan:
Run `yarn travis` before and after this commit, and note that the
resulting build log is prettier after.
wchargin-branch: ci-silent
Summary:
This CI script accomplishes two tasks:
1. It speeds up our build by parallelizing where possible.
2. It opens the possibility for running Travis cron jobs.
By default, this script does the same amount of work as our existing CI
script. However, I’d like to move `yarn backend` into the list of basic
actions: a backend build failure should fail CI.
Note: this script is written to be executable directly by Node, so we
can’t use Flow types with the standard syntax. Instead, we use the
comment syntax: https://flow.org/en/docs/types/comments/
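For example, annotations in comment form look like this (per the linked
docs):
```js
// Flow comment syntax: the annotations live inside comments, so plain
// Node can execute the file without any Babel transform.
function pluralize(word /*: string */, count /*: number */) /*: string */ {
  return count === 1 ? word : word + "s";
}
```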
Test Plan:
The following should pass with useful output:
- `npm run travis`
- `GITHUB_TOKEN="your_github_token" npm run travis -- --full`
The following should fail with useful output:
- `npm run travis -- --full` (fail)
To test different failure modes, it can be helpful to add
```js
{id: "doomed", cmd: ["false"], deps: []},
{id: "orphan", cmd: ["whoami"], deps: ["who", "are", "you"]},
```
to the list of `basicTasks` in `travis.js`.
To test performance:
```shell
$ time node ./config/travis.js >/dev/null 2>/dev/null
real 0m8.306s
user 0m20.336s
sys 0m1.364s
$ time bash -c \
> 'npm run check-pretty && npm run lint && npm run flow && CI=1 npm run test' \
> >/dev/null 2>/dev/null
real 0m12.427s
user 0m13.752s
sys 0m0.804s
```
A 50% savings is not bad at all—and the raw time saved should only
improve from here on, as the individual steps start taking more time.
wchargin-branch: custom-ci
Summary:
This can be useful for speed, but it can also be important for
correctness (at least theoretically): if we run both these scripts
concurrently, then we don’t want one of them to squash the `bin`
directory while the other is about to invoke an executable therein.
One might note that the diffs to the two files in this commit are
virtually identical, and indeed the files themselves are quite similar.
I’d prefer to keep the duplication for now; if we really need a Bash
snapshot testing framework, we can factor one out.
Test Plan:
Run each script with `--help`, with `--build` and `--no-build`, and with
and without `-u`.
wchargin-branch: optional-rebuild
Summary:
To avoid confusion, we simultaneously remove unused capturing groups.
This is not strictly necessary, but it makes the code less brittle.
Test Plan:
The newly added test fails before the change to `findReferences.js`.
wchargin-branch: url-punctuation
This commit adds the `addReferenceEdges()` method to the GitHub parser,
which examines all of the posts in the parsed graph and adds reference
edges when it detects references between posts. For example, `Hey
@wchargin, take a look at #1337` would generate two references.
We currently parse the following kinds of references:
- Numeric references to issues/PRs.
- Explicit in-repository url references (to any entity)
- @-author references
We do not parse:
- Cross-repository urls
- Cross-repository shortform (e.g. `sourcecred/sourcecred#100`)
`Parser.parse` calls `addReferenceEdges()`, so no change is required by
consumers to have reference edges added to their graphs.
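To give a flavor of the detection, here is an illustrative sketch (not
the actual implementation; the production logic is more careful about
punctuation and URL forms):
```js
// Sketch only.
function sketchFindReferences(body) {
  const numericRefs = body.match(/(?:^|\s)#\d+\b/g) || [];
  const atMentions = body.match(/(?:^|\s)@[A-Za-z0-9-]+/g) || [];
  return [...numericRefs, ...atMentions].map((s) => s.trim());
}

// sketchFindReferences("Hey @wchargin, take a look at #1337")
//   => ["#1337", "@wchargin"]
```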
The GitHub porcelain API layer now includes methods for retrieving the
entities referenced by a post.
Test plan:
This commit is tested both via snapshot tests and via explicit testing
at the API layer. (Actually, the creation of the porcelain API layer was
prompted by wanting a cleaner way to test this commit.) I recommend
inspecting the snapshot tests for sanity, but mostly relying on the
tested behavior in api.test.js.
I added a lot of new comments that have URL references to different
types of GitHub entities, e.g. to pull request review comments.
The commit was generated by running the example repo fetcher, then
running `yarn test` and updating snapshots.
Summary:
Flow didn’t catch this because all the payloads are `{}` anyway.
Test Plan:
Note that every node and edge payload is now listed exactly once in the
correct spot for each of `{Node,Edge}{Type,Payload}`.
wchargin-branch: git-nodepayload
This commit removes `Graph.inEdges` and `Graph.outEdges`. As a
replacement to these two functions, `graph.neighborhood` now takes an
optional `direction` flag, which can be set to `"IN" | "OUT" | "ANY"`.
This reduces the surface area of the Graph API, and means that the same
pattern can be used when requesting in or out neighbors as is used when
requesting all neighbors.
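A sketch of how a call site migrates (whether `direction` is passed in
an options object, as here, is an assumption):
```js
// Sketch: what used to be `graph.inEdges(nodeAddress)` (an Edge[]) is
// now a neighborhood query returning {edge, neighbor} records.
function inNeighborsOf(graph, nodeAddress) {
  return graph.neighborhood(nodeAddress, {direction: "IN"});
}
```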
This change generates significant churn in the test files, and in some
cases the tests are less elegant and betray their history: they were
written against the type signature of `{in,out}Edges`, which returned a
plain array of edges, and now receive an array of neighbors. I think
this is acceptable, and not worth rewriting the tests.
In many cases, replacing existing calls to `{in,out}Edges` in our actual
codebase resulted in cleaner code, as `neighborhood` successfully
abstracts over the common patterns that users of `{in,out}Edges` were
implementing.
As a fly-by refactor, I also changed the `neighborAddress` part of the
`neighborhood` return value to `neighbor`. It's a little less
descriptive, but it's more concise, and Flow is there to help ensure
it's used correctly.
Test plan: Note that CI passes. Inspect the test changes, and verify
that they are appropriate transformations for consuming the new API.
Test Plan:
This snapshot test is too unwieldy to actually read—it’s 1000 lines of
opaque SHAs and thrice-stringified JSON objects—so it should be
interpreted as a regression test only. The programmatic tests should
suffice.
wchargin-branch: wip-git-create-graph
Test Plan:
Run `yarn lint` and `yarn travis` and observe success. Add something
that triggers a lint warning, like `const zzz = 3;`; re-run and observe
failures.
wchargin-branch: lint
In general, methods in the porcelain GitHub api may return multiple
types; e.g. a reference could be to an Issue, PullRequest, Comment,
Author (or more). To make working with the api more convenient while
maintaining safety, this commit adds a static `asType` method to each
Entity class, which confirms that type coercion is safe, and errors if
not.
This commit also adds `issueOrPRByNumber`, a convenience method, to
api.test.js.
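A sketch of how this reads at a call site (the module path and the
accessors other than `asType` are assumptions):
```js
const {Issue} = require("./api");

// Sketch only: coerce an entity we believe to be an Issue; `asType`
// throws if that belief is wrong.
function titleOfIssueEntity(entity) {
  const issue = Issue.asType(entity);
  return issue.title(); // `title()` is an assumed porcelain accessor
}
```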
Test plan: Check the API usage and verify that it is reasonable.
Interacting with raw contribution graphs is cumbersome. We'll need
more fluent and convenient ways to retrieve data from them; we can do
this by creating porcelain APIs that wrap the underlying graph.
This commit adds a simple porcelain API for the GitHub data. It creates
the following classes:
* `api.Repository`
* `api.Issue`
* `api.PullRequest`
* `api.Comment`
* `api.Author`
The classes all wrap a graph and a nodeAddress. They provide read-only
functions for retrieving data from the graph; that data might be part of
the node payload, or retrieving it might require some graph traversal
under the hood.
The choice to have the wrapper hold onto the Address rather than the
node itself was deliberate; in the future, the graph may contain nodes
that are not synchronously reachable, so this approach allows us to
create wrappers for nodes we can't synchronously reach. When this comes
up in practice, we can then add async methods to the wrapper.
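A sketch of how the wrappers are meant to read at a call site
(constructor shape and accessor names are guesses based on this
description):
```js
const {Repository} = require("./api");

// Sketch only: each wrapper holds the graph plus the address of its node.
function printIssueAuthors(graph, repositoryAddress) {
  const repository = new Repository(graph, repositoryAddress);
  for (const issue of repository.issues()) {
    const logins = issue.authors().map((author) => author.login());
    console.log(`#${issue.number()}: ${logins.join(", ")}`);
  }
}
```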
Note that some data already included in our graph, such as
PullRequestReviews and PullRequestReviewComments, were deliberately
excluded, so as to allow the core ideas to be reviewed without
unnecessary clutter.
Test plan:
Check that the unit tests appropriately test the behavior, and that the
API seems pleasant to use.
Summary:
Two reasons for this. First, we want tests to be able to operate on this
data without having to generate repositories via `git(1)`. (Doing that
is slow, requires a Git installation, makes it less clear that the tests
are correctly isolated, and provides more surface area for something to
go wrong.) Second, plugins will in general need a canonical source of
test data, so setting/continuing this precedent is a good thing.
Test Plan:
Observe that the old Jest snapshot must be equivalent to the new JSON
one, because the test criterion in `loadRepository.test.js` changed and
the test still passes. Then, run `loadRepositoryTest.sh` and note that
it passes; change the `example-git.json` file and note that the test
fails when re-run; then, run the test with `--updateSnapshot` and watch
it magically revert your changes.
wchargin-branch: check-in-git-repo
Summary:
I’d like to use `Map`s whenever the keys are homogeneous (i.e.,
dictionaries, not structs). But this has proven infeasible. The primary
issue at this point is that `JSON.stringify(anyMap)` is `"{}"`—not
entirely unreasonable given that maps can have non-string keys, but
frustrating enough to not use them.
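The core annoyance, concretely:
```js
// Maps serialize uselessly; plain objects round-trip as expected.
JSON.stringify(new Map([["a", 1], ["b", 2]])); // '{}'
JSON.stringify({a: 1, b: 2}); // '{"a":1,"b":2}'
```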
Test Plan:
Jest appears to order the snapshot keys differently for `Map`s and
objects (the former by insertion order and the latter alphabetical),
which makes the snapshot change harder to read. I verified that the
general structure is okay, and hand-verified some of the individual
changes. The number of lines added and deleted in the snapshot is also
a good sanity check.
wchargin-branch: map-to-object
When requesting nodes and edges from the graph, it is convenient to
filter them by their type.
In the future, we should add plugin filtering as well, as we
expect type names to collide across plugins.
We may also want to consider keeping a cache of nodes and edges by type
to speed up these calls, if they become performance bottlenecks. (The
implementation in this commit naively iterates over every node/edge.)
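A sketch of the intended call shape (the exact filter signature and the
type names below are assumptions):
```js
// Sketch only.
function issueNodes(graph) {
  return graph.nodes({type: "ISSUE"});
}
function containmentEdges(graph) {
  return graph.edges({type: "CONTAINS"});
}
```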
Test plan:
Verify that the unit tests are appropriate.
Currently, we store GitHub Users, Organizations, and Bots as separate
nodetypes in the graph. This is inconvenient, as we don't care very much
what type of entity authored a node.
This commit collapses those three categories into one nodetype. The
extra information has migrated to the node payload, so it is still
possible to discover this information if it's important.
Test plan: There is some amount of snapshot churn because the author
node types and payloads have changed. Verify that the snapshot changes
are appropriate, and that CI passes.
This commit renames the following graph functions:
* `get{Node,Edge}{,s}` -> `{node,edge}{,s}`
* `get{In,Out}Edges` -> `{in,out}Edges`
* `getNeighborhood` -> `neighborhood`
The rename was effected across the repo by running:
```
$ find src -name "*.js" -exec sed -i 's/getNeighborhood/neighborhood/g' {} +
```
modified appropriately for each substitution.
Test plan:
Inspect the code to make sure nothing was erroneously renamed. Check that
CI passes.
`Graph.getAdjacentEdges` had a serious defect: for each adjacent edge,
it's hard to tell which of `src` and `dst` is the neighboring node and
which is the node we called `getAdjacentEdges` on.
This commit fixes that limitation by replacing `getAdjacentEdges` with
`getNeighborhood`, with a return signature of
`{edge: Edge<EP>, neighborAddress: Address}[]`.
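A sketch of consuming the new return value:
```js
// Sketch: both halves of each adjacency are now explicit.
function neighborAddressesOf(graph, nodeAddress) {
  return graph
    .getNeighborhood(nodeAddress)
    .map(({neighborAddress}) => neighborAddress);
}
```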
Some yak shaving was required: we needed a version of `expectSameSorted`
and, by extension, `sortedByAddress` that takes an accessor to an
Addressable, so that we could test that the neighborhoods returned were
correct. To satisfy Flow, I created `expectSameSortedAccessorized` and
`sortedByAddressAccessorized`. Cumbersome, but it worked. ¯\_(ツ)_/¯