
Spotlight on Sphinx: Python Docs For Everyone

If you’ve ever looked at the Python documentation, you’ve seen Sphinx in action.

Sphinx is an open-source project that automatically generates static websites for Python documentation. Beyond code-heavy documentation, it can also be used as a general-purpose static site generator.

What is Sphinx?

Sphinx is a Python project that takes in reStructuredText and outputs static web pages. It is most commonly used to host documentation. With Sphinx installed, you can write docstrings in your code, much as you would with Javadoc, and it will pull in all of those docstrings to provide a big picture of your functions and classes. This can be extremely helpful as a programming reference, and since the content is pulled directly from the code, you don’t have to worry about it getting out of sync.
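For example, here’s a minimal sketch of a docstring that autodoc can pull in; the function and its fields are invented for illustration:

def convert(amount, rate):
    """Convert an amount of money using an exchange rate.

    :param amount: the amount to convert
    :param rate: the exchange rate to apply
    :returns: the converted amount
    """
    return amount * rate

A directive such as .. autofunction:: convert in one of your reStructuredText pages (with the autodoc extension enabled and the module importable) then renders that docstring as formatted reference documentation.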

Who’s using it?

Sphinx was originally created to generate the official Python documentation, but it has only grown from there. Many popular libraries use Sphinx to host their documentation, and it has become something of an industry standard among Python developers.

In addition, the popular documentation host Read the Docs makes this process even easier: connect your repository, and your Sphinx docs are built and published on every change, just as you would build code. This “docs-as-code” approach helps keep the code and the documentation in sync and mitigates documentation debt.

Notable companies and libraries across the industry use Sphinx to host their websites or documentation.

And it doesn’t stop at documentation: people have written their personal sites, courses, and even whole books using Sphinx.

How can Sphinx help me?

Now that you know about all the great things Sphinx can do, I bet you’re wondering how you can use it in your work. Sphinx is adaptable enough to work for many use cases, but it really shines at documenting code, Python code in particular.

If you’re writing Python software as part of your job and having trouble maintaining the docs (or God forbid, you don’t have any docs!), Sphinx is definitely worth a try. It’s free, open source, and there are a variety of resources and tutorials out there to help you customize it to your needs.

Sphinx is great when you have structured information. For that reason, Sphinx might not be the best choice for hosting your latest novel, but it’s a great fit for technical manuals with a complex table of contents that readers need to navigate. Another great feature of Sphinx is that search comes built in, so you don’t have to pull in another package to do the heavy lifting for you.
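To illustrate, the table of contents for a Sphinx site is declared with the toctree directive, usually in index.rst; the document names below are placeholders:

.. toctree::
   :maxdepth: 2

   installation
   usage
   api

Each entry names another reStructuredText document in your project, and Sphinx builds the nested navigation and the search index from there.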

Set up your first Sphinx site

Ready to get started? Let’s go through the basics of installing and setting up your first Sphinx site.

You can install Sphinx from PyPI (Python Package Index) by running this command:

$ pip install Sphinx

Once Sphinx is installed, run sphinx-quickstart from your project folder to initialize a project there:

$ sphinx-quickstart
    1. sphinx-quickstart asks a series of questions to set some initial configuration values; you can always go back and change them later. For most projects the defaults will suffice, but be sure to enter your project’s name and author when prompted.
    2. If you want to generate docs from your Python code, be sure to enable the autodoc extension (disabled by default); the conf.py sketch after these steps shows the relevant setting.
    3. When sphinx-quickstart finishes, you should have several new files and folders used to configure and manage your site. If you need to change any configuration values in the future, you can do that in conf.py.
    4. The Makefile is what builds the documentation and packages it into HTML for the web. To build the example skeleton project, run:
$ make html
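If you enabled autodoc in step 2, conf.py is where that choice lives. Here is a minimal sketch of the relevant lines; the project name is a placeholder, and your generated file will contain many more settings:

# conf.py -- generated by sphinx-quickstart
project = 'My Project'

# Extensions are opt-in; autodoc pulls API documentation from docstrings.
extensions = ['sphinx.ext.autodoc']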

Once make html completes, the output files will be in _build/html. Navigate there now:

$ cd _build/html

The home page for our site is index.html. Open that file in a web browser to see the example project:

$ open index.html

You should see the basic layout of your new Sphinx site.

[screenshot: the default layout of a new Sphinx site]

Congratulations! You have Sphinx up and running.

For next steps on how to add posts and customize Sphinx, I recommend Brandon’s Sphinx Tutorial (PDF). It’s both informative and easy to follow.

Now that you know about Sphinx, go out there and Write The Docs!


Continuous Integration: A View into appendTo’s Process

appendTo has been leveling up many of our internal processes, procedures, and standards. This has been a long time coming, and it has recently been given dedicated attention. One of those efforts, and the focus of this article, is our latest upgrade to our continuous integration and deployment (CI/CD) procedures and systems. It is one big piece of a larger, ongoing standards-and-practices effort within our organization, one that benefits both our customers and our staff. If you would like to read more about our general developer workflow, standard practices, and tools, read Aaron Bushnell’s recent article, “The Tools We Use”!

Unique Requirements

There are some unique challenges to solve in a professional services organization when it comes to project server management. Each client, and each project for those clients, inevitably has unique software requirements. One client may have a back end in Rails, which requires a specific application and testing stack, while another may use Node.js with MongoDB. Furthermore, one client might need Node.js 0.10.28 because 0.10.29 includes a change that breaks their application. These widely varying requirements mean appendTo can’t simply deploy all client applications to a single staging server; each project needs its own environment.

Unfortunately, we don’t typically have the freedom to assign a server administrator to each project, and even if we could, it would not be the best use of our time, and thus of our clients’ investment. The solution appendTo has implemented balances the needs of our clients with ease of use for our developer staff. The system is highly automated, but it allows developers to extensively customize the entire build process through simple JSON configuration within the project repository. We use TeamCity for version control triggers, build step management, access control, and reporting and logging. On the other side of the process, we use Digital Ocean (and its HTTP API) to dynamically create virtual machines (droplets) when needed, and to host and serve application files.
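For a sense of what that API interaction looks like, creating a droplet is a single authenticated HTTP call. This is an illustrative sketch against Digital Ocean’s v2 API; the droplet name, region, size, and image slugs are placeholders, and $DO_TOKEN is assumed to hold an API token:

$ curl -X POST "https://api.digitalocean.com/v2/droplets" \
    -H "Authorization: Bearer $DO_TOKEN" \
    -H "Content-Type: application/json" \
    -d '{"name": "client-project-staging", "region": "nyc3", "size": "512mb", "image": "ubuntu-14-04-x64"}'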

Bridging the Gap

The biggest problem we had with TeamCity was integrating, in the way we needed to, with the Digital Ocean API and with individual droplets. We needed to create new droplets on the fly, allow each application to configure them during provisioning, deploy the application files, test them, and of course serve them up for our clients to see. TeamCity is written in Java, and while it may have been nice to integrate directly with that piece of the puzzle, we decided to create a solution abstracted one layer away from that tool. That drove our engineers to create a bridge application which accepts commands (in this case from TeamCity) and executes code in our deployment system (Digital Ocean droplets) via a set of Node.js scripts and libraries. The benefit is that if either TeamCity or Digital Ocean needs to be replaced in our stack, our unique build and deployment process can be salvaged and reused in the new stack. Of course, one of our greatest strengths at appendTo is our deep knowledge of JavaScript, which made Node a natural choice for this bridge.

So what does this all look like? The diagram below identifies the big pieces. On the left we have GitHub, where all of our application code resides. TeamCity, on the server in the center of the diagram, identifies code changes in a repository and initiates a build process. The build process sends various commands, in a particular order, to the Node application, which in turn interacts with the Digital Ocean HTTP API and directly with the droplets over SSH. The droplets are provisioned when necessary (usually on first run) from a standard image that gives projects access to commonly needed software (like Node). Finally, the Node scripts report the results of each action back to TeamCity, creating a single source for build logs and history.

[diagram: the CI/CD process, from GitHub through TeamCity and the Node bridge to Digital Ocean droplets]

Wrap It Up and Put a Bow On It

This setup works very well for our needs, but we wanted to make things easier on our developers. To that end, we made each step of the build process configurable within the project itself. Many of our projects use Grunt to run tests, concatenate and minify files, and generally build a project for distribution. To allow for the simplest integration of that process, the specific actions required for each build step are specified in the “scripts” block of the project’s package.json file (a Node convention).

Here is what that might look like:

{
    "name": "my-project",
    ...,
    "scripts": {
        "provision": "apt-get install mongodb",
        "install": "bower install",
        "test": "grunt test",
        "stop": "forever stopall",
        "deploy": "grunt deploy",
        "start": "forever server/main.js"
    }
}

Each of the lines in this “scripts” block maps to a step (or part of a step) in the larger build process. They are all executed within the code directory, meaning developers don’t need to worry about where the code is located and can focus on what needs to be done to make it work. Additionally, since each line is executed (essentially) on the command line over SSH, a developer could point to a shell script should their process be too complex for a one-line statement. Note that using Grunt is not required at all, but this process simplifies using tools like it with continuous integration and deployment.
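For instance, a deploy step that outgrows a one-liner might point to a script checked into the repository, e.g. "deploy": "./scripts/deploy.sh". This is a hypothetical sketch; the script path, web root, and individual commands are all illustrative:

#!/bin/bash
# scripts/deploy.sh -- a multi-step deploy, kept alongside the code
set -e                      # abort on the first failing command
grunt build                 # run the project's build tasks
cp -R dist/. /var/www/app   # copy the built output into the web root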

Sustainable Process

As we discussed earlier, the goal with this bridge application written in Node is portability. Should we need to transition to a new tool on either side of that diagram above, we can reuse what we have written and our existing projects will require no alterations. This makes our developers happy, our clients happy, and results in better solutions delivered faster.

By implementing better standards and practices, and properly vetting them through our development staff, appendTo is able to accomplish two goals at once: rapidly deliver better solutions to our clients, and make the lives of our developers easier. The first may seem obvious: better standards (and adherence to them) translate directly into cleaner, more manageable code. The second may not be as obvious, but it is just as important to us. We value our developers’ time as well as their opinions, which is why we vet our standards through them. Any employee can submit a change to our standards and practices; the change is discussed across the entire organization and ultimately approved by an S&P board composed of upper-level engineers.


Getting to know Bower: a package manager for the web

How long does it typically take you to identify and download all of the third-party libraries and dependencies you want to use for a front-end web development project? Minutes…hours? Maybe even days? Identifying and downloading all of the libraries you want to use, along with all of their dependencies, can be a time-consuming and tedious process. Dependency management is an area where a good package manager can save us a lot of time and headaches.

You might be familiar with some other package managers. If you’re a Linux user, you’re probably familiar with apt-get or RPM. If you’re a .NET developer, you’ve probably at least heard of NuGet, if you’re not already using it. Node developers know and love npm. But does such a thing exist for front-end web development?

Yes it does.

Bower is a package manager for the web, and it can dramatically simplify dependency management for your front-end web development projects. It also makes it much easier for package developers to share their packages with others. These two reasons alone make it a valuable tool to have in your developer toolbox.
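To give a feel for it, installing a library and recording it as a dependency is a single command (the package here is just an example):

$ bower install jquery --save

The --save flag records the dependency in your project’s bower.json, which ends up looking something like this:

{
    "name": "my-app",
    "dependencies": {
        "jquery": "~2.1.1"
    }
}

Anyone who clones your project can then run bower install with no arguments to pull everything listed there into the bower_components directory.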

Below is a presentation I put together that walks you through how to use Bower from start to finish. It covers everything from installing Bower and using it to manage dependencies in your front-end projects, to creating and maintaining your own package for distribution through Bower. If you’ve been curious about whether Bower is the right tool for you, this should give you the information you need to make that decision and get started.


https://presentboldly.com/ryexley/getting-to-know-bower