6

Keeping Your Documentation Up-to-Date with Bit and GitHub

 3 years ago
source link: https://blog.bitsrc.io/keeping-your-documentation-up-to-date-with-bit-and-github-9cc6e62906f8


Use the power of Bit and GitHub actions to automatically update the code you show on your documentation

Image by S. Hermann & F. Richter from Pixabay

Writing documentation for your code, whether it is a small library or a full set of microservices other developers need to interact with, is definitely one of the most tedious tasks in our industry.

The main issue with documenting a system is that it can very easily become outdated as the code changes. And although we tend to favor automation in our tasks, keeping documentation in sync is a task that normally has to be done by hand: going snippet by snippet, making sure every example in our documentation is up-to-date. That is, of course, until now, because in this article I’m going to show you a workflow that will help you identify the moment a snippet becomes outdated and needs to be checked.

So let’s get into it, shall we?!

The Workflow

The key behind this workflow is Bit’s GitHub bot, which takes care of creating a Pull Request for every repository currently using a component that gets updated.

Just to clarify, in case you’re not aware of Bit: Bit is a platform that enables developers to share their JavaScript components (we’re talking React, Vue, Angular, and even Node.js modules).

Bit components are best thought of as a super-set of standard node packages. In addition to their distributable code (the node package), a Bit component has the information needed for it to be independently maintained and developed (for example, its source code, documentation, version history, development environment setup configurations, and more.)

Exploring shared components on Bit.dev

Back to the topic at hand: this automated behavior allows component consumers to be notified the moment something changes. Let’s face it, one of the major problems with today’s JavaScript ecosystem is that it moves so fast that libraries and frameworks get updated every day. And these updates sometimes break part of our code (or, in fact, part of our documentation).

To avoid this, I’m going to present a workflow involving Bit’s bot, a GitHub Action, and a tool called Gist-it, which will allow us to automatically embed code snippets from GitHub.

So you have a better idea of what we’re going to be covering here, our final workflow should look like this:


I know it looks like a lot, so let’s review it; you’ll see that there are no complicated steps here:

  1. First things first: the trigger for the entire workflow will be an update made to your component. We can safely do this thanks to the fact that Bit is aware of every repository using a component, which in turn allows us to react to a component update instead of actively checking whether something changed.
  2. Thanks to this integration, Bit will then create a PR for every project using the updated component. In the case of projects having the code already integrated into the code base, this PR will update that code. However, for projects simply importing the component as a dependency, the PR will update the dependency version. Either way, a change will be created.
  3. And that is it for now. At this point the workflow stops until you, as the project maintainer, decide to merge the PR. Of course, GitHub will tell you whether there are any merge conflicts, so you can go about it however you want. The point is that once you merge, the following will happen:
  4. A GitHub action will be triggered. This action will run a set of tests you’ve created for your code samples (we’ll get to this in a minute), and if any of them fails, it’ll execute step 5.
  5. A deprecation warning will be added at the top of the affected code snippet. This is just the example action we’ll show here; if you’re familiar with GitHub Actions, you’ll know you can add further steps, such as notifying over Slack or email that something’s not right, letting you know which files require immediate attention to keep the documentation up-to-date.
  6. Finally, you’ll be including the code snippets automatically from GitHub. Why? Because that way your documentation will reflect that deprecation warning the second it’s added. This in turn will notify any user hoping to use your snippets for their own code.

Let’s now go over each step required for you to get this ball rolling.

Being the component’s author

If you happen to be the author of the component, you’ll have to have created an Organization inside Bit.dev, which is free, so don’t worry about it. Once there, you’ll be able to add collections and components inside it.

Pick the component you want this to work for, and click on the Integrations button located at the top right corner of your Organization page.


After that, you’ll be shown all the available integrations; we’ll pick GitHub for this example.


Finally, you’ll get to pick the component you want this to work for, as well as the repositories (and their branches) you want to affect. For our example, we’ll pick “All”, since we want everyone using our components to, at least, be notified when we update it.


Click the “Add” button and the new integration will be listed just like in the image above.

That’s all you have to worry about as the author.

Setting up the right project structure

One of the benefits of automation is that you don’t have to worry about manually doing a particular task. But for this to happen, you have to favor certain configurations or folder structures. The same applies in our case: we’ll need to follow certain rules for this automation to take place.

The folder structure

You can customize the following structure however you like, as long as you then update the rest of the examples to accommodate for that change.

However, the folder structure I went with for my sample project is the following:

  • The .github folder will be created for us by GitHub, or we can add it manually; that’s not a problem. This is where the workflow file for our GitHub Action will live.
  • The components folder is where we’ll have our components “imported” from Bit. You can really have them anywhere; I’m just using this as an example.
  • The examples folder is where we’ll save our snippets and their corresponding test files (inside the examples/tests folder).
  • And finally, the last interesting bit is the file called test-output-parser.js, which will take the output of our tests, parse it, and figure out which files to update based on the outcome.

Setting the tests for our snippets

I know what you’re thinking: who writes tests for their snippets? And I get it, if you only have 2 or 3 snippets, there is really no point for that. But then again, if you only have 3 snippets, then there is no point for you to be doing any of what I’m showing here.

However, if you’re building something massive, which has very rich documentation, you could be looking at tens or even hundreds of code snippets, showing all the potential ways of interacting with your project. And if that is the case, trust me, you’ll want to automatically check if they’re still valid or not after the component they depend on gets updated.

So, for each snippet you have, write at least one test, and make sure they look like this:

A couple of points:

  1. I’m using mocha for my tests, if you’re using something else, you’ll need to adapt the rest of the examples as well.
  2. Notice how the nested describe mentions the name of the file the snippet comes from. This is required because we need a way to know which file the test is testing (and there is really no other way of doing this).

This should be enough to make sure our snippets are still valid.

Creating the Github Action

With our folder structure in place and our tests written, we need to make sure that once we merge a PR, we execute them, and parse their output.

Here is where GitHub comes into play. We can do this by going to our repository and clicking the Actions tab:


Click on the Create new workflow button and pick any of the templates available; we’ll replace their content in a second anyway.

Essentially, we want a workflow that gets triggered on a Push event to our main branch. This is because I’m assuming you’re not allowing developers to push directly to main; instead, only PRs will be created against it, and the only actual push will be done by GitHub when we merge one. Sadly, there are no merge triggers right now, so we can’t really use that event here.

The content of our workflow file looks like this:
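The workflow file was embedded as a gist in the original post; what follows is only a sketch of what such a file might contain (the file name, action versions, and exact step layout are assumptions, so the line numbers mentioned in the commentary refer to the author’s original file, not to this sketch):

```yaml
# .github/workflows/check-snippets.yml (name assumed)
name: Check documentation snippets

on:
  push:
    branches: [main]

jobs:
  test-snippets:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v2
      - uses: actions/setup-node@v2
        with:
          node-version: 14
      - run: npm install
      # Run the snippet tests with mocha's JSON reporter and hand the
      # report to the parser; "|| true" keeps the job going on failures
      - name: Run snippet tests and flag outdated files
        run: node test-output-parser.js "$(./node_modules/mocha/bin/mocha examples/tests -R json || true)"
      # Commit any deprecation warnings the parser added
      - name: Commit deprecation warnings
        run: |
          git config user.name "github-actions"
          git config user.email "actions@github.com"
          git add -A
          git diff --staged --quiet || git commit -m "ci: Automatic Deprecation Warning added"
          git push
```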

There is some boilerplate code in there, but the main things to look at are:

  • In line 16 we’re setting Node’s version to be 14. We can add a list of versions if we want our tests to run with different ones just in case.
  • Line 26 runs npm install which makes sure we have every dependency required for our project to run the tests. This is also important, because if your PR only updated the package’s version inside package.json, then this will install that version.
  • Line 27 runs the tests, parses the output and updates all needed files. This one line takes care of everything (we’ll take a look at the JS file in a minute).
  • Lines 28–40 take care of updating the branch. Since we’ve potentially made changes to some files (the snippets that are deprecated), we’ll need to create a new commit (hence the commit message and the credentials shown there).

This workflow will run every time you merge a PR, whether it was created by Bit or by other users. Either way, every time it runs, you’ll see an output like the following inside the logs screen:


Notice how in this particular scenario, the examples/sample1.js file got updated with the deprecation warning.

Also, in case you didn’t notice it, the command we’re using to run our tests is: ./node_modules/mocha/bin/mocha examples/tests -R json

We have to use the local version of mocha that gets installed as a dependency, and we’re using the JSON reporter, which will simplify the task of parsing the output. Funnily enough, that’s the next topic to cover.

Parsing the output of the tests

To make things simpler, we’re using Mocha’s JSON reporter, which gives all the information we need in a format that Node.js can easily interpret.
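For reference, here is an abbreviated (and purely illustrative) example of the shape of that output; note how the snippet’s filename ends up inside each failure’s fullTitle, because fullTitle concatenates the nested describe names:

```json
{
  "stats": { "tests": 3, "passes": 2, "failures": 1 },
  "failures": [
    {
      "title": "still produces the documented output",
      "fullTitle": "Documentation snippets examples/sample1.js still produces the documented output",
      "err": { "message": "expected 3 to equal 4" }
    }
  ],
  "passes": []
}
```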

Our script will have to:

  1. Look for errors reported
  2. For each one, we’ll parse the message, extracting the filename
  3. Open the file, and add the deprecation warning at the top (only if it doesn’t have it already).

And here is the code that takes care of that:

The script is very straightforward: it captures the JSON received as part of the argv array and parses it. If there is anything inside the failures array, we iterate over those elements and perform the tasks I already mentioned.

Nothing else: the content of the files will be updated only if their corresponding tests failed and they don’t already have the warning inside them (this makes sure we don’t keep adding warning after warning with subsequent PRs).

How do we know if the workflow changed anything?

The easiest way to check for this is to look at your repository’s list of files. If the workflow ran and some tests failed, it should have made the changes and saved them with the following message: “ci: Automatic Deprecation Warning added”

So just look for that in the list of files, like this:


As I mentioned before, you can extend this workflow even further and add the ability to notify you through the channels you need (e.g., Slack, email, or whatever you use).

Using the code snippets directly from Github

The final step we haven’t talked about yet is including the code snippets directly from GitHub. This will cause your documentation to be automatically updated with the deprecation warning the moment it is added.

And because you might also want to keep documentation and implementation as two separate projects, this tool might be the best fit for your needs: Gist-it.

All you have to do is include the following script tag wherever you want your snippet to appear:

<script src="http://gist-it.appspot.com/URL-TO-SNIPPET-FILE"></script>

So, the following page:
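(The example page was embedded as a gist in the original post. A minimal sketch of such a page might look like the following; the repository path in the src attribute is an assumption.)

```html
<!-- docs.html — a minimal documentation page embedding a snippet via Gist-it -->
<html>
  <body>
    <h1>Using the sum component</h1>
    <p>The snippet below is pulled straight from GitHub:</p>
    <!-- the repository path below is hypothetical -->
    <script src="http://gist-it.appspot.com/github.com/my-org/my-repo/blob/main/examples/sample1.js"></script>
  </body>
</html>
```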

Gets rendered like this:


You don’t really have to worry about the snippet: it will automatically update with any new change published to GitHub, and as you can see, it already comes with syntax highlighting. Notice the first line of the code: the deprecation warning is already in place.

Conclusion

Over the course of this article, you’ve created a workflow that not only allows you to stay up-to-date with any changes made to your dependencies, but also lets you have a good night’s sleep knowing the snippets you use in your documentation are either updated properly or, at the very least, marked as deprecated and easily identified.

This is no small feat, and the fact that you’ve done it with a few simple configuration files and a small script is definitely a win in my book.

It’s time for you to try it now and please, let me know down in the comments if you’ve tried the workflow and made any improvements. I’d love to know!
