Elijah's Blog

Lessons learned from building a “simple” Django application

Django is a very popular web application framework for Python. It was released in 2005 and is currently at version 2.2, and it was the framework I built my first website with in 2013. It is fantastic and fully featured, including what some consider the best ORM available.

My friends and I had an idea for a website where we could build and showcase our cool visualization ideas. I thought, “oh yeah, that should just be a simple Django app”. I didn’t want to deal with webpack or react, and I wanted to just upload all my static files to S3 with Cloudflare in front of it, to keep things "simple".

I decided to use Docker with AWS ECS to deploy the app and allow for easy scalability. Continuous deployment would run through CodePipeline: when a pull request is merged into master, the pipeline kicks off, builds and deploys the app, and finally sends a message to Slack. Building and running the application locally worked great, but when I started trying to deploy the static files I ran into tons of issues.

I had two main requirements when starting this project: I wanted to be able to use SCSS and ES6, without webpack or React. In the past, I've used django-pipeline to handle compilation of SCSS, but ES6 was a new adventure. In addition to django-pipeline, there is a package called django-storages that helps with managing files in S3 or any other cloud provider. At Carta, we used both of these packages, but not in tandem: we wrote our own "collectstatic" command that used django-pipeline to compile the files locally and then ran another command that used django-storages to upload them to S3. I didn't want to do this, so I decided to figure out how to make django-pipeline and django-storages work together.

Getting SCSS to work with django-pipeline is straightforward, as there are SCSS binaries available for most platforms. (We'll come back to getting it to work with Docker in a bit.) However, I also wanted to be able to write ES6 code and have it transpiled down to plain JavaScript. I didn't realize how difficult this would be.

To transpile ES6 down to plain JavaScript, your only real option is Babel. It's the same tool a React project uses under the hood, and it should be simple to install as a standalone transpiler too. Unfortunately, Babel recently restructured its packages and presets: what used to be babel-preset-env is now @babel/preset-env, so you need to be careful when reading docs and copy/pasting code. Presets are required; without one, Babel won't actually transform your code. After much research, I determined that I should just use @babel/preset-env. An interesting quirk of Babel is that when you specify presets on the command line, you leave off the "preset" part. This means my command ended up looking like babel --presets @babel/env, which still doesn't make a whole lot of sense to me. Since I didn't want to use webpack, I didn't have a package.json file. I thought I could just install Babel locally with npm install, but the results weren't consistent between Docker and my local computer.
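For reference, the standalone setup can be quite small. This is a minimal sketch, assuming Babel 7 installed via npm (the directory names here are hypothetical): put a .babelrc next to your sources,

```json
{
  "presets": ["@babel/preset-env"]
}
```

and then transpiling is a single command, e.g. npx babel static/src --out-dir static/js, with no webpack in sight.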

My Dockerfile ended up including these lines to “properly” install Babel.

RUN npm install @babel/[email protected] @babel/[email protected] @babel/[email protected]
RUN npm install -g @babel/[email protected] @babel/[email protected] [email protected]

This, however, wasn’t enough to get Babel working. I was able to compile my static files locally, but when I pushed up to CI they wouldn’t compile: Babel complained that it couldn’t see the preset-env preset. I thought Docker made code the same across environments! In most cases this is true, but after 8 hours (not exaggerating) of debugging I found out that I needed to add a node_modules:/code/node_modules volume to my docker-compose.yml. This way, when the Docker image is built, the node_modules directory is “saved” in a named volume that can be reused later when running the container with docker-compose.
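To make that concrete, here's a sketch of the volume wiring (the service name and paths are hypothetical; adjust them to your project):

```yaml
# hypothetical docker-compose.yml excerpt
version: "3"
services:
  web:
    build: .
    volumes:
      - .:/code                           # bind-mount the source tree
      - node_modules:/code/node_modules   # named volume shadows node_modules
volumes:
  node_modules: {}
```

The named volume shadows that path inside the bind mount, so the node_modules installed at image build time aren't hidden by the (empty) directory on the host.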

After figuring this out, I was ecstatic and foolishly hit “rebuild” at 12 am thinking the build would succeed. It didn’t.

I started getting lots of weird errors when trying to compile and upload my SCSS/CSS to S3. Again, it worked fine locally. The errors were about files not existing, like OSError: file does not exist css/nocloud/main.css. This was strange, because I did have a file called css/nocloud/main.scss, which should have been compiled to main.css. This led me down many rabbit holes and outdated pull requests in both django-pipeline and django-storages. It also led me to a very sad comment that said:

most people have moved on to javascript bundlers like webpack or whatever, this has little use now.

I thought to myself, “sure, lots of people use webpack, but that shouldn’t be a good enough reason to just ignore a helpful pull request!”. Since that issue was still outstanding and the pull requests trying to fix it and other issues hadn’t been merged, I ended up having to use two monkey-patch solutions, which still don’t fully solve the problem. In one of my apps I have:

from django.apps import AppConfig


class CoreAppConfig(AppConfig):
    name = 'nocloud.core'

    def ready(self):
        from pipeline.packager import Packager
        from django.contrib.staticfiles.finders import get_finders

        def __monkey_compile(self, paths, force=False, **kwargs):
            paths = self.compiler.compile(paths, force=force)
            for path in paths:
                if not self.storage.exists(path):
                    if self.verbose:
                        print("Compiled file '%s' cannot be found with packager's storage. Locating it." % path)

                    source_storage = self.find_source_storage(path)
                    if source_storage is not None:
                        with source_storage.open(path) as source_file:
                            if self.verbose:
                                print("Saving: %s" % path)
                            self.storage.save(path, source_file)
                    else:
                        raise IOError("File does not exist: %s" % path)
            return paths

        def __monkey_find_source_storage(self, path, **kwargs):
            for finder in get_finders():
                for short_path, storage in finder.list(''):
                    if short_path == path:
                        if self.verbose:
                            print("Found storage: %s" % str(storage))
                        return storage
            return None

        Packager.compile = __monkey_compile
        Packager.find_source_storage = __monkey_find_source_storage

And then in my storage, I have:

import os
from tempfile import SpooledTemporaryFile

from django.contrib.staticfiles.storage import ManifestFilesMixin
from pipeline.storage import PipelineMixin
from storages.backends.s3boto3 import S3Boto3Storage
from storages.utils import setting

class BotoMixin(S3Boto3Storage):
    default_acl = 'public-read'

    def _save_content(self, obj, content, parameters):
        """
        Create a clone of the content file, because when it is passed to
        boto3 it wrongly closes the file upon upload, whereas the storage
        backend expects it to still be open.
        """
        # Seek our content back to the start
        content.seek(0, os.SEEK_SET)

        # Create a temporary file that will write to disk after a specified size
        content_autoclose = SpooledTemporaryFile()

        # Write our original content into our copy that will be closed by boto3
        content_autoclose.write(content.read())

        # Upload the object which will auto close the content_autoclose instance
        super()._save_content(obj, content_autoclose, parameters)

        # Cleanup: if this is fixed upstream, our duplicate should always close
        if not content_autoclose.closed:
            content_autoclose.close()

class S3PipelineStaticFilesStorage(PipelineMixin, ManifestFilesMixin, BotoMixin):
    bucket_name = setting('STATIC_FILE_BUCKET_NAME')
    custom_domain = setting('STATIC_FILE_CUSTOM_DOMAIN')

class S3PipelineMediaStorage(BotoMixin):
    bucket_name = setting('MEDIA_BUCKET_NAME')
    custom_domain = setting('MEDIA_CUSTOM_DOMAIN')
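
The clone trick in _save_content above can be seen in isolation. This is a minimal sketch (the function name clone_for_upload is mine, not part of django-storages):

```python
import os
from tempfile import SpooledTemporaryFile


def clone_for_upload(content):
    """Copy an open file into a SpooledTemporaryFile.

    A consumer (like boto3) can then close the clone while the
    original file object stays open for the storage backend.
    """
    # Seek the original back to the start before copying
    content.seek(0, os.SEEK_SET)
    clone = SpooledTemporaryFile()
    clone.write(content.read())
    # Rewind the clone so the consumer reads it from the beginning
    clone.seek(0)
    return clone
```

Closing the clone leaves the original untouched, which is exactly the behavior the storage backend needs.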

The problem is that this still doesn’t fully solve my original OSError problem. Whenever I create a new django-pipeline CSS bundle, I first have to create an empty file with the same name in S3, and only then collect my static files.
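As a sketch, that manual workaround looks something like this with the AWS CLI (the bucket and key names here are hypothetical):

```shell
# create an empty placeholder object so the packager can "find" the bundle
aws s3api put-object --bucket my-static-bucket --key css/nocloud/newbundle.css

# then compile and upload the static files as usual
python manage.py collectstatic --noinput
```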

Looking back, I now see why we built our own solution at Carta to get around this problem. Getting these libraries to work with S3 out of the box just isn’t possible at this point. What really gets me is that there are people trying to fix the problem, yet no one seems to care, because “just use webpack”. I think webpack and React have their place, but I don’t understand why it shouldn’t be possible to build a web application without them. After all these workarounds we finally have our website up at https://theno.cloud/ but it’s still not a seamless deployment process when I push something to master.

There are a couple things I’m curious about here:

  1. Do other web frameworks like Rails have this same problem?
  2. Am I just wrong to try to use modern tools like SCSS and ES6 with only server-side rendering in Django?

The last thing I learned from this adventure is that in 2018, you don’t really have any options when you want to transpile ES6: it’s Babel or nothing. Similarly, I couldn’t find any alternative to django-pipeline that provided all the features I needed, like SCSS and ES6 compilation.

An SCSS sidebar: unfortunately, there aren’t any precompiled binaries for libsass on Ubuntu. This means that if you want to compile SCSS inside Docker, you have to build libsass yourself, which adds about 10 lines and 5 minutes to your Docker builds.

# SASSC_VERSION and LIBSASS_VERSION must be defined earlier (via ENV or ARG)
ENV SASS_BINARY=/usr/bin/sassc

RUN git clone https://github.com/sass/sassc /sassc
RUN cd /sassc && git checkout $SASSC_VERSION
RUN git clone https://github.com/sass/libsass /libsass
RUN cd /libsass && git checkout $LIBSASS_VERSION
RUN cd /sassc && SASS_LIBSASS_PATH=/libsass make
RUN mv /sassc/bin/sassc $SASS_BINARY

RUN rm -rf /sassc

In all honesty, I may end up rewriting the frontend in React, because this is such a pain to deal with. It makes me sad, though, that I can’t write a simple web application using just Django anymore unless I give up ES6 and SCSS. Maybe that’s just the way things will be from now on in 2019.

Note: this was written a few months ago (around August/September 2018), and our website https://theno.cloud isn't being actively developed at the moment. The current site there is just a single HTML file and image uploaded to an S3 bucket.