A little zip trouble

For some strange reason AWS Elastic Beanstalk was handing me the error “Failed to unzip source bundle, abort deployment (Executor::NonZeroExitStatus)” when I uploaded a new application version.

This was for a zip file put together using either Powershell’s Compress-Archive or [IO.Compression.ZipFile]::CreateFromDirectory. Either way the result was an archive that Windows and 7zip were both perfectly happy to open, but AWS was not. Yet creating a zip file with the same contents using 7zip and uploading it to AWS worked fine. Unfortunately I was trying to script this in Powershell, so shelling out to 7zip wasn’t a useful option.

Figuring AWS was using Linux, I opened a successful and an unsuccessful zip in Ubuntu and there was an immediate difference: the good zip had a sub-folder called publish-output, while the bad one had lots of publish-output\xyz entries in the root. Despite that, both happily extracted in Ubuntu. This .NET/Linux incompatibility led to some more Googling and finally an answer: .NET breaks the zip specification’s rule that entry paths use forward slashes. That turned into a wonderful workaround in Powershell:

$ZipEncoding = @'
using System;
using System.Text;

public class ZipEncoding : UTF8Encoding
{
    public ZipEncoding() : base(true) {}

    public override byte[] GetBytes(string s)
    {
        // Swap the backslashes .NET writes for the forward slashes
        // the zip specification requires.
        s = s.Replace("\\", "/");
        return base.GetBytes(s);
    }
}
'@
Add-Type -TypeDefinition $ZipEncoding
$encoder = New-Object -TypeName ZipEncoding

Add-Type -As System.IO.Compression.FileSystem 
[IO.Compression.ZipFile]::CreateFromDirectory((Resolve-Path $publishPath), $zipPath, "Optimal", $true, $encoder) # true creates a surrounding root folder
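Independent of the Powershell fix, entry names are easy to inspect before uploading. A minimal sketch in Python, whose zipfile module follows the specification and always writes forward slashes:

```python
import io
import zipfile

# Build a tiny in-memory archive; zipfile writes '/' separators, as the
# zip specification requires for entry names.
buf = io.BytesIO()
with zipfile.ZipFile(buf, "w") as z:
    z.writestr("publish-output/web.config", "<configuration/>")

# An archive is suspect if any entry name contains a backslash.
with zipfile.ZipFile(buf) as z:
    bad = [name for name in z.namelist() if "\\" in name]

print(bad)  # [] -> every entry uses forward slashes
```

The same namelist() check against a bad archive produced by CreateFromDirectory would show the backslashed entries directly.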

Aurelia, Bootstrap, and Sass

Today’s objective was to override Bootstrap variables to permit some theme customization. Overall the process had a few pitfalls and I found that many references to this combination of tools were now out of date, presumably the result of changes to Aurelia’s project structure since they were written.

Sass Preprocessor

I started with a default Aurelia setup, which supports plain CSS and doesn’t include a preprocessor for Sass. For reference, I generated an Aurelia project using au new with the Sass preprocessor selected. This highlighted the first change needed: replacing the cssProcessor section in the aurelia.json file:

  "cssProcessor": {
    "id": "sass",
    "displayName": "Sass",
    "fileExtension": ".scss",
    "source": "src/**/*.scss"
  }
Processing Sass also requires a compiler. I used gulp-sass, obtained via npm install gulp-sass --save-dev (--save-dev because this is a build tool and not required at runtime). To introduce it into the build process, I changed tasks/process-css.ts:

import * as gulp from 'gulp';
import * as changedInPlace from 'gulp-changed-in-place';
import * as sourcemaps from 'gulp-sourcemaps';
import * as sass from 'gulp-sass';
import * as project from '../aurelia.json';
import {build} from 'aurelia-cli';

export default function processCSS() {
  return gulp.src(project.cssProcessor.source)
    .pipe(changedInPlace({firstPass: true}))
    .pipe(sourcemaps.init())
    .pipe(sass().on('error', sass.logError))
    .pipe(build.bundle());
}

Then I added an .scss file and referenced it from the app.html with a .css extension. At this point doing au run and opening the browser showed the scss styles applied.

Adding Bootstrap

Aurelia doesn’t create a physical CSS file; instead, the build.bundle() call in tasks/process-css.ts adds it to app-bundle.js. As the goal was to customize Bootstrap before its CSS is generated, the generated CSS, and therefore Bootstrap itself, must be included in the app-bundle rather than the vendor-bundle (where the contact manager tutorial puts it). This meant removing jquery and Bootstrap from the vendor-bundle section of aurelia.json and putting them in the app-bundle section as follows:

"bundles": [
  {
    "name": "app-bundle.js",
    "source": [ ... ],
    "dependencies": [
      ...,
      "jquery",
      {
        "name": "bootstrap-sass",
        ...
      }
    ]
  },
  {
    "name": "vendor-bundle.js",
    ...
  }
]

The JSON above gets slightly ahead of the story because it references bootstrap-sass, the Sass version of Bootstrap. This was obtained using the command npm install bootstrap-sass --save. I also had to clear out the original Bootstrap by deleting it from package.json and running npm prune, and then follow similar steps for the Typings.
At this point the scss file was as follows:

$navbar-default-bg: #800;
@import '../node_modules/bootstrap-sass/assets/stylesheets/bootstrap';
div {
    border: 1px solid green;
}

Building this with au build resulted in fairly verbose and unilluminating errors. The cause: to work, Bootstrap depends on gulp-autoprefixer, which needed to be added to process-css.ts as follows:

import * as gulp from 'gulp';
import * as changedInPlace from 'gulp-changed-in-place';
import * as autoprefixer from 'gulp-autoprefixer';
import * as sourcemaps from 'gulp-sourcemaps';
import * as sass from 'gulp-sass';
import * as project from '../aurelia.json';
import {build} from 'aurelia-cli';

export default function processCSS() {
  return gulp.src(project.cssProcessor.source)
    .pipe(changedInPlace({firstPass: true}))
    .pipe(sourcemaps.init())
    .pipe(sass().on('error', sass.logError))
    .pipe(autoprefixer())
    .pipe(build.bundle());
}

With that, I successfully overwrote a Bootstrap variable.


SQLProvider and MySQL

While SQL Server is well supported in F# through the built-in type provider FSharp.Data.SqlClient, SQL Server itself has the drawback of cost, which can be difficult to justify in a small business or in an environment where its richer features are not required. The type provider I’ve used thus far, SQLProvider, supports a wider array of database engines, and this post explores using it with MySQL.

Data Migration

MySQL Workbench includes tooling to migrate from another database. To access it, open the Database menu and select Migration Wizard…. Then follow the Migration Task List, noting the following:

  1. In Source Selection use Database System = Microsoft SQL Server, Connection Method = ODBC (native)
  2. In Manual Editing select Column Mappings from the View dropdown (top-right):
    • change the target types to suit MySQL. For instance, where Source Type is UNIQUEIDENTIFIER, set Target Type to CHAR(36).
    • remove the UNIQUE target flag from columns which are not primary keys.
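The 36 in CHAR(36) comes from the canonical text form of a GUID: 32 hex digits plus four hyphens. A quick sanity check, sketched in Python:

```python
import uuid

# A GUID's canonical text form is 8-4-4-4-12 hex digits: 36 characters.
guid = uuid.uuid4()
text = str(guid)

print(len(text))        # 36 -> fits CHAR(36)
print(text.count("-"))  # 4 separating hyphens
```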

Code Changes

After completing migration and verifying the creation of the tables and copying of data, the calling code needs to change to support MySQL:

let resPath = __SOURCE_DIRECTORY__ + @"/../../packages/MySql.Data/lib/net45"

type Sql = SqlDataProvider<
            ResolutionPath = resPath,
            ConnectionString = connectionString, // defined elsewhere in the project
            DatabaseVendor = Common.DatabaseProviderTypes.MYSQL>

Unfortunately when I did this I kept getting an error “Unable to resolve assemblies. One of MySql.Data.dll … must exist in the paths …”. Debugging errors with type providers is difficult as they run at build time. In this case I directly created a MySqlCommand by typing let x = MySql.Data.MySqlClient.MySqlCommand, which gave more useful errors: “The type X is required here and unavailable. You must add a reference to System.Y…”; adding those missing references helped. At some stage I also needed to restart Visual Studio after correctly setting the reference path for the MySQL dll.

Changing the database engine changes the objects generated by the type provider, so some find-and-replace is required to correct these. There are also some type changes, such as bytes becoming signed bytes.


The first, and frankly show-stopping, problem is with Guids. MySql.Data interprets CHAR(36) as a System.Guid (it can instead use BINARY(16) via the OldGuids connection-string property). However, SqlProvider generates a string property, and the end result is a runtime conversion error between the two inside SqlProvider. My impression, although questions to that effect have gone unanswered, is that this is a limitation of SqlProvider. Likely the best solution is to use CHAR(37) instead of CHAR(36), so the driver no longer treats the column as a Guid, and then add a lot of manual string to/from Guid conversion.
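To illustrate the sort of manual shim that workaround implies, here is a sketch in Python (the helper names are hypothetical; the real code would be the F# equivalent):

```python
import uuid

# Hypothetical helpers standing in for the manual conversions a CHAR(37)
# column would force: the database sees text, the application sees a Guid.
def guid_to_db(g: uuid.UUID) -> str:
    return str(g)  # store the canonical 36-char form as plain text

def db_to_guid(s: str) -> uuid.UUID:
    return uuid.UUID(s.strip())  # tolerate padding from a fixed-width column

g = uuid.uuid4()
roundtrip = db_to_guid(guid_to_db(g))
print(roundtrip == g)  # True: the conversion is lossless
```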

A question I’ve not yet answered is whether this works on Mac and Linux. In theory I believe it should as the MySQL client dll can run against Mono as described here. However given the poor GUID support my inclination is to try a different database engine at this stage.

Running on Linux

As the F# Foundation states: “F# is a mature, open source, cross-platform, functional-first programming language”. Today I decided to try out the cross-platform part of that statement by trying to get my project running on Linux.

I’m running Ubuntu 16.10 64-bit with 3GB of RAM in VirtualBox. The steps below come from various sources, which are referenced by links.

F# Setup on Linux

Steps taken from http://fsharp.org/use/linux/ and sites it references. All steps were run in a Terminal in Ubuntu.

Step 1. Add mono to apt sources

sudo apt-key adv --keyserver hkp://keyserver.ubuntu.com:80 --recv-keys 3FA7E0328081BFF6A14DA29AA6A19B38D3D831EF
echo "deb http://download.mono-project.com/repo/debian wheezy main" | sudo tee /etc/apt/sources.list.d/mono-xamarin.list
sudo apt-get update

Step 2. Install Mono

sudo apt-get install mono-devel
sudo apt-get install mono-complete 

At this point I was able to create and run the hello.cs program as described here, meaning Mono, and therefore .NET, was functioning on the machine.

Step 3. Finally, install fsharp

sudo apt-get install fsharp

Setting up the project

To get the project running, we first needed the source code:

git clone -b Suave https://github.com/winterlimelight/FunctionalDomainProject.git FunctionalDomainProject
cd FunctionalDomainProject

and then to restore the libraries using Paket. This came with a slight hiccup, as Linux needed to be told that the bootstrapper and paket were executables, using chmod.

chmod a+x .paket/paket.bootstrapper.exe
chmod a+x .paket/paket.exe
.paket/paket.exe update

At this point I tried my first compile but got a series of errors rooted in Store.fs. This is the file containing the SqlDataProvider connection string, and in order for F# to compile, it needed to be able to connect to that database. This required the connection string to change to reference the IP address of the VirtualBox host machine, and to replace Trusted_Connection=true with User Id=...;Password=.... The host machine needed that SQL login created and given dbo rights to the AssetManager database. It also needed a firewall exclusion added for SQL Server.

With those changes in place, the following command performed a successful compile:

xbuild AmApi/AmApi.fsproj

To run it, the executable needed execute rights, then could be called:

cd AmApi/bin/Debug/
chmod a+x AmApi.exe
./AmApi.exe

The integration tests are hosted in Postman, a Chrome extension, so it was a simple matter to install that extension in Chrome on Ubuntu, open the test collection, and run it. The results: 22/22 passed.

Playing in the environment

Beyond this I also tried to get debugging going using VS Code with Ionide. I found some information about possible configuration steps necessary for F# debugging, but couldn’t get it working myself.

I also decided to create a FAKE build by creating an empty project using the Ionide FAKE plugin in VS Code. This created the necessary build.* files, which were copied into the project repository, and a Paket dependency for FAKE was added. The outcome of that can be seen here.

Dependency Injection

My project architecture has been set up to allow dependency injection. For instance, the commands take repository instances as arguments. But the approach I’ve been preparing is very object-oriented, and in my notes I had mused that partial application would be a more functional way of doing this. However, exactly how to structure it had eluded me. Thankfully the ever-understandable F# for Fun and Profit just published a post, Functional approaches to dependency injection, that bridges this gap, so now I’m going to walk through my conversion from interfaces to function types.

Interfaces to Function Types

The repositories are the simplest place to start. At present the interface for the template repositories is:

type ITemplateReadRepository =
    abstract member FindById: System.Guid -> Template option

type ITemplateWriteRepository =
    abstract member FindById: System.Guid -> Template option
    abstract member Save: Template -> unit

Changing these to function types means replacing each member with a function type.

I firmly believe in read-write separation, so it’s important to distinguish between finds made by the write system for the purpose of identity and validation, and finds made by the read system for querying. So despite the identical signatures, I like having distinct types for FindById.

type FindTemplateById = System.Guid -> Template option
type FindTemplateByIdForValidation = System.Guid -> Template option
type SaveTemplate = Template -> unit

In a large project these would likely be separated into different modules purely for the purpose of code organization.

Passing partials

The current implementations of these methods directly instantiate a data context, which means they create a dependency, which is what we’re trying to avoid.

member this.FindById (id:Guid) = 
    let dc = Sql.GetDataContext()
    // use dc to find template

In object-oriented dependency injection the domain class would have a dependency on some IRepository and the IoC container would create a repository instance passing in the connection information. In functional programming this option is not available, so the dependencies have to be passed as function arguments meaning we need a method with this signature:

DbContext -> Guid -> Template option
// for example, the persistence method:
let findTemplateById (dc:DbContext) (id:Guid) = ...

However this means the caller has to know how to create the DbContext dependency. That is likely not the responsibility of the caller, so we need another abstraction that manages the dependencies and only requires the caller to provide the values it is responsible for. We can do this by providing a function which converts between the signature understood by the caller and the signature of the target method.

// Persistence method: DbContext -> Guid -> Template option
let FindTemplateById dc id = ...  

// Domain method: (Guid -> Template option) -> Guid -> Template option 
let GetTemplate findById id = 
    findById id

// Caller
let dc = Sql.GetDataContext()
let findTemplateByIdWithDc = FindTemplateById dc // Signature Converter
let res = GetTemplate findTemplateByIdWithDc id

The converting function, findTemplateByIdWithDc, is a partial application of FindTemplateById: we have not specified all of the arguments, leaving the second (id) to be set when findById is called.
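For comparison outside F#, the same shape can be sketched in Python using functools.partial (all names here are illustrative): binding the context once yields a function with the narrower signature the domain expects.

```python
from functools import partial

# Persistence: needs both the data context and the id.
def find_template_by_id(dc, template_id):
    return dc.get(template_id)  # stand-in for a real database lookup

# Domain: only knows about an (id -> template) function.
def get_template(find_by_id, template_id):
    return find_by_id(template_id)

# Caller: partially applies the context, producing (id -> template).
dc = {42: "template-42"}  # stand-in data context
find_with_dc = partial(find_template_by_id, dc)

print(get_template(find_with_dc, 42))  # template-42
```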

In my project the DbContext instance is created in the program.fs which is a layer higher than the caller function (my Api level) above. This same pattern can be applied so that the DbContext is passed transparently through the Api level as well as the Domain. For the sake of organization, all of these ‘signature converters’ are placed into a file given the name CompositionRoot. That file is defined immediately before the first file that uses it, in this case before program.fs. The end result looks something like the following, which is a snapshot of the full stack used for the GET template request.

type FindTemplateById = System.Guid -> Template option // domain/persistence 
type IGetTemplate = System.Guid -> Template option // api/domain

// Persistence.fs
module Persistence.TemplateReadRepo =
    let findById dc id = 
        // use dc ...

// (Domain)Operations/Template.fs
module Operations.Template =
    let GetTemplate (findById:FindTemplateById) id = 
        findById id

// Api/TemplateController.fs
module Api.Template =
    let getTemplate (getTemplateById:IGetTemplate) (id:Guid) : WebPart =
        match (getTemplateById id) with ...

// CompositionRoot.fs
module Operations =
    let getTemplate dc = AmApi.Operations.Template.GetTemplate (Persistence.TemplateReadRepo.findById dc)

module ApiMethods =
    let getTemplate dc = Api.Template.getTemplate (Operations.getTemplate dc)

// Program.fs
let route dc =
    choose [
        pathScan ... (ApiMethods.getTemplate dc)
    ]

The composition root creates partial functions like Operations.getTemplate dc, which means the argument given to Api.Template.getTemplate still conforms to the signature it requires while the information about the context travels down to the domain, and in similar fashion on to the persistence call where it is finally used.
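As a closing sketch (Python again, with hypothetical stand-ins for each layer), the same threading of dc through partial application looks like this; only the composition root ever sees the context:

```python
from functools import partial

# Persistence layer: the only code that touches the context directly.
def find_by_id(dc, template_id):
    return dc.get(template_id)

# Domain layer: depends only on an (id -> template) function.
def get_template(find, template_id):
    return find(template_id)

# Api layer: depends only on the domain function.
def api_get_template(domain_get, template_id):
    return domain_get(template_id)

# Composition root: partial application threads dc through every layer.
def compose(dc):
    domain_get = partial(get_template, partial(find_by_id, dc))
    return partial(api_get_template, domain_get)

handler = compose({1: "welcome-template"})
print(handler(1))  # welcome-template
```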