While SQL Server is well supported in F# through the built-in type provider FSharp.Data.SqlClient, SQL Server itself has the drawback of cost, which can be difficult to justify in a small business or an environment where its richer features are not required. The type provider I’ve used thus far, SQLProvider, supports a wider array of database engines, and this post explores using it with MySQL.

Data Migration

MySQL Workbench includes tooling to migrate from another database. To access this, open the Database menu and select Migration Wizard…. Then follow the Migration Task List, noting the following:

  1. In Source Selection use Database System = Microsoft SQL Server, Connection Method = ODBC (native)
  2. In Manual Editing select Column Mappings from the View dropdown (top-right):
    • change the target types to suit MySQL. For instance, where Source Type is UNIQUEIDENTIFIER, set Target Type to CHAR(36).
    • remove the UNIQUE target flag from columns which are not primary keys.

Code Changes

After completing migration and verifying the creation of the tables and copying of data, the calling code needs to change to support MySQL:

let resPath = __SOURCE_DIRECTORY__ + @"/../../packages/MySql.Data/lib/net45"

type Sql = SqlDataProvider<
            ResolutionPath = resPath,

Unfortunately when I did this I kept getting an error “Unable to resolve assemblies. One of MySql.Data.dll … must exist in the paths …”. Debugging errors with type providers is difficult as they run at build time. In this case I directly created a MySqlCommand by typing let x = MySql.Data.MySqlClient.MySqlCommand, which gave more useful errors: “The type X is required here and unavailable. You must add a reference to System.Y…”, so adding those missing references helped. At some stage I also needed to restart Visual Studio after correctly setting the reference path for the MySQL dll.

Changing the database engine changes the objects generated by the type provider so some find-and-replace is required to correct these. There are also some type changes, such as changing bytes to signed bytes.
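For example, MySQL’s TINYINT comes through as a signed byte rather than the unsigned byte SQL Server’s tinyint produces, so byte literals in pattern matches change from `1uy` to `1y`. A sketch (the function name and ValueType codes are illustrative, matching the read mapping shown later):

```fsharp
// MySQL TINYINT maps to sbyte, so unsigned byte patterns (1uy) become signed (1y)
let mapFieldType (valueType: sbyte) =
    match valueType with
    | 1y -> "string"
    | 2y -> "date"
    | 3y -> "numeric"
    | _  -> failwith "Unknown field type"
```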


The first, and frankly show-stopping, problem is with Guids. MySql.Data interprets CHAR(36) as a System.Guid (it can instead use BINARY(16) via a connection string property: OldGuids). However SqlProvider created a string property, and the end result is a runtime conversion error between the two within SqlProvider. My impression, although questions to that effect have not been answered, is that this is a limitation of SqlProvider. Likely the best solution is to use CHAR(37) instead of CHAR(36) and then add a lot of manual string to/from Guid conversion.
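A sketch of what that manual conversion layer might look like (the helper names are hypothetical):

```fsharp
// Hypothetical helpers for round-tripping Guids through string columns
let guidToDb (g: System.Guid) : string = g.ToString("D")   // 36-character hyphenated form
let guidFromDb (s: string) : System.Guid = System.Guid.Parse(s)
```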

A question I’ve not yet answered is whether this works on Mac and Linux. In theory I believe it should as the MySQL client dll can run against Mono as described here. However given the poor GUID support my inclination is to try a different database engine at this stage.

Running on Linux

As the F# Foundation states: “F# is a mature, open source, cross-platform, functional-first programming language”. Today I decided to try out the cross-platform part of that statement by trying to get my project running on Linux.

I’m running Ubuntu 16.10 64-bit with 3GB of RAM in VirtualBox. The steps below come from various sources, which are referenced by links.

F# Setup on Linux

Steps taken from the sources referenced above. All steps were run in a Terminal in Ubuntu.

Step 1. Add mono to apt sources

sudo apt-key adv --keyserver hkp:// --recv-keys 3FA7E0328081BFF6A14DA29AA6A19B38D3D831EF
echo "deb wheezy main" | sudo tee /etc/apt/sources.list.d/mono-xamarin.list
sudo apt-get update

Step 2. Install Mono

sudo apt-get install mono-devel
sudo apt-get install mono-complete 

At this point I was able to create and run the hello.cs program as described here, meaning Mono, and therefore .NET, was functioning on the machine.

Step 3. Finally, install fsharp

sudo apt-get install fsharp

Setting up the project

To get the project running we first needed the source code

git clone -b Suave FunctionalDomainProject
cd FunctionalDomainProject

and then to restore the libraries using paket. This came with a slight hiccup, as Linux needed to be told that the bootstrapper and paket were executables by using chmod.

chmod a+x .paket/paket.bootstrapper.exe
chmod a+x .paket/paket.exe
.paket/paket.exe update

At this point I tried my first compile but got a series of errors rooted at Store.fs. This is the file containing the SqlDataProvider connection string, and in order for F# to compile, it needed to be able to connect to that database. The connection string had to change to reference the IP address of the VirtualBox host machine, and Trusted_Connection=true had to be replaced with User Id=...;Password=.... The host machine needed the above SQL login created and given dbo rights to the AssetManager database, as well as a firewall exclusion for SQL Server.

With those changes in place, the following command performed a successful compile:

xbuild AmApi/AmApi.fsproj

To run it, the executable needed execute rights, then could be invoked:

cd AmApi/bin/Debug/
chmod a+x AmApi.exe
./AmApi.exe

The integration tests have been hosted in Postman, a Chrome extension, so it was a simple matter to install that extension in Chrome in Ubuntu and open the test collection and run it. The results: 22/22 passed.

Playing in the environment

Beyond this I also tried to get debugging going using VS Code with Ionide. I found some information about possible configuration steps necessary for F# debugging, but couldn’t get it working myself.

I also decided to create a FAKE build by creating an empty project using the Ionide FAKE plugin in VS Code. This created the necessary build.* files which were copied into the project repository, and a paket dependency for FAKE created. The outcome of that can be seen here.
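For reference, a minimal FAKE (classic API) build.fsx along these lines might look like the following sketch; the package path and target name are illustrative:

```fsharp
// build.fsx - minimal FAKE "classic" script (illustrative)
#r "packages/FAKE/tools/FakeLib.dll"
open Fake

Target "Build" (fun _ ->
    !! "AmApi/AmApi.fsproj"       // projects to build
    |> MSBuildDebug "" "Build"    // Debug configuration, Build target
    |> Log "Build output: "
)

RunTargetOrDefault "Build"
```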

Dependency Injection

My project architecture has been set up to allow dependency injection. For instance, the commands take repository instances as arguments. But the approach I’ve been preparing is very object-oriented, and in my notes I had mused on how partial application would be a more functional way of doing this. However exactly how to structure this has eluded me. Thankfully the ever understandable F# for Fun and Profit has just published a post, Functional approaches to dependency injection, that bridges this gap, so now I’m going to walk through my conversion from interfaces to function types.

Interfaces to Function Types

The repositories are the simplest place to start. At present the interfaces for the template repositories are:

type ITemplateReadRepository =
    abstract member FindById: System.Guid -> Template option

type ITemplateWriteRepository =
    abstract member FindById: System.Guid -> Template option
    abstract member Save: Template -> unit

Changing these to function types means replacing each member with a function type.

I firmly believe in read-write separation so it’s important that there is a distinction made between finds made by the write system for the purpose of identity and validation, and finds made by a read system for querying. So despite having identical signatures, I like the concept of different types for FindById.

type FindTemplateById = System.Guid -> Template option
type FindTemplateByIdForValidation = System.Guid -> Template option
type SaveTemplate = Template -> unit

In a large project these would likely be separated into different modules purely for the purpose of code organization.
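An implementation can then be annotated with the function type so the compiler verifies conformance; a stubbed sketch:

```fsharp
// Annotating with the function type documents which role this find serves
let findTemplateById : FindTemplateById =
    fun id ->
        // query the store here; stubbed for illustration
        None
```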

Passing partials

The current implementations of these methods directly instantiate a data context, which creates a dependency inside the function, which is what we’re trying to avoid.

member this.FindById (id:Guid) = 
    let dc = Sql.GetDataContext()
    // use dc to find template

In object-oriented dependency injection the domain class would have a dependency on some IRepository and the IoC container would create a repository instance passing in the connection information. In functional programming this option is not available, so the dependencies have to be passed as function arguments meaning we need a method with this signature:

DbContext -> Guid -> Template option
// for example, the persistence method:
let findTemplateById (dc:DbContext) (id:Guid) = ...

However this means the caller has to know how to create the DbContext dependency. That is likely not the responsibility of the caller, so we need another abstraction that manages the dependencies and only requires the caller to provide the values it is responsible for. We can do this by providing a function which converts between the signature understood by the caller and the signature of the target method.

// Persistence method: DbContext -> Guid -> Template option
let FindTemplateById dc id = ...  

// Domain method: (Guid -> Template option) -> Guid -> Template option 
let GetTemplate findById id = 
    findById id

// Caller
let dc = Sql.GetDataContext()
let findTemplateByIdWithDc = FindTemplateById dc // Signature Converter
let res = GetTemplate findTemplateByIdWithDc id

The converting function, findTemplateByIdWithDc, is a partial application of FindTemplateById: we have not supplied all of the arguments, leaving the second (id) to be set when findById is called.
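The same mechanics can be seen in a completely self-contained example, using a Map in place of the database (all names here are illustrative):

```fsharp
// A stand-in 'repository' taking its dependency (the store) as the first argument
let findById (store: Map<System.Guid, string>) (id: System.Guid) =
    Map.tryFind id store

let id1 = System.Guid.NewGuid()
let store = Map.ofList [ (id1, "Template A") ]

// Partially apply the dependency; the result has signature Guid -> string option
let findByIdWithStore = findById store

findByIdWithStore id1 // Some "Template A"
```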

In my project the DbContext instance is created in Program.fs, which is a layer higher than the caller (my Api level) above. The same pattern can be applied so that the DbContext is passed transparently through the Api level as well as the Domain. For the sake of organization, all of these ‘signature converters’ are placed into a file named CompositionRoot. That file is defined immediately before the first file that uses it, in this case before Program.fs. The end result looks something like the following, which is a snapshot of the full stack used for the GET template request.

type FindTemplateById = System.Guid -> Template option // domain/persistence 
type IGetTemplate = System.Guid -> Template option // api/domain

// Persistence.fs
module Persistence.TemplateReadRepo =
    let findById dc id = 
        // use dc ...

// (Domain)Operations/Template.fs
module Operations.Template =
    let GetTemplate (findById:FindTemplateById) id = 
        findById id

// Api/TemplateController.fs
module Api.Template =
    let getTemplate (getTemplateById:IGetTemplate) (id:Guid) : WebPart =
        match (getTemplateById id) with ...

// CompositionRoot.fs
module Operations =
    let getTemplate dc = AmApi.Operations.Template.GetTemplate (Persistence.TemplateReadRepo.findById dc)

module ApiMethods =
    let getTemplate dc = Api.Template.getTemplate (Operations.getTemplate dc)

// Program.fs
let route dc =
    choose [
        pathScan ... (ApiMethods.getTemplate dc)

The composition root creates partially applied functions like Operations.getTemplate dc, which means the argument given to Api.Template.getTemplate still conforms to the signature it requires while the context information travels down to the domain, and similarly on to the persistence call where it is finally used.


Having been some time with an in-memory mutable dictionary serving as a data store, the time seemed right to introduce a permanent data store.

Selecting a data store

Choosing a store for a solution is a key architectural decision and requires evaluating a huge set of factors including requirements, domain-store match, cost, scale, read vs write counts, eventual consistency vs availability, transactional integrity, etc.

There are a number of different stores that are reasonable for this project. Ideally I’d try implementing them all, however I don’t expect that to happen in the short-term. These are some options:

  • Document store e.g. MongoDB. This may be appropriate given our aggregates (template and asset) are quite complex objects.
  • Search index e.g. ElasticSearch. If there are many orders of magnitude more reads than writes then optimizing for querying using a search index may be appropriate.
  • Event store. An event store, typically implemented manually on flat files, records all the business events triggered by API or system actions. It is often used in a CQRS system in concert with a read-specific store. Having read/write separation in our stack allows for this option.
  • Relational database e.g. SQL Server. Provides transactions, is the most mature, and the best supported by libraries.

Choosing between these relies on criteria that don’t exist because this isn’t a commercial project, so because I’m learning and trying to reduce the number of learning variables I chose the tried-and-true, a relational database.

As an aside, as an architect we also want to allow for future contingencies where it is practical. By providing a clear persistence layer in the code and comprehensive integration tests, we allow the choice of store to be re-evaluated in the future (providing the organization is prepared for the cost).


F# has an excellent feature, Type Providers, that generate strongly typed code at compile time for interacting with data sources. This page has a nice comparison of some type providers for F#. For no particular reason I selected SQLProvider.

As mentioned earlier, the domain aggregates are relatively deep, so they map to multiple tables, which is why in a production solution I’d lean towards a non-relational store. Here is a comparison of the domain type template, and the tables that store it in SQL (constraints excluded):

type FieldValue = 
    | StringField of string
    | DateField of System.DateTime
    | NumericField of float

type FieldDefinition = {
    Id: System.Guid
    Name: string
    Field: FieldValue
}

type Template = {
    Id: System.Guid
    Name: string
    Fields: FieldDefinition list
    MaintenanceProgramId: System.Guid option
}

CREATE TABLE [dbo].[FieldValue](
    [FieldValueId] [uniqueidentifier] NOT NULL,
    [FieldDefinitionId] [uniqueidentifier] NOT NULL,
    [AssetId] [uniqueidentifier] NULL,
    [ValueType] [tinyint] NOT NULL,
    [StringValue] [nvarchar](max) NULL,
    [DateValue] [datetime] NULL,
    [NumericValue] [real] NULL)

CREATE TABLE [dbo].[FieldDefinition](
    [FieldDefinitionId] [uniqueidentifier] NOT NULL,
    [Name] [nvarchar](255) NOT NULL,
    [TemplateId] [uniqueidentifier] NOT NULL)

CREATE TABLE [dbo].[Template](
    [TemplateId] [uniqueidentifier] NOT NULL,
    [Name] [nvarchar](255) NOT NULL,
    [MaintenanceProgramId] [uniqueidentifier] NULL)

The end result is that we end up with some fairly complex queries and mapping logic. One of the sources of complexity is that the FieldValue table is used both for default template field values (when AssetId is null) and asset field values.

The query contains two left outer joins because a template may exist without any fields. This is shown using the rather cryptic (!!) operator. It would be nice if a more pleasant name could be used for this operator.

The mapping between discriminated unions and the underlying store can be handled by joining to a different table for each case, or by the approach used here which is to have a different column for each case. Using different columns makes the query easier, but results in a sparse table. Given we only have three cases the sparseness of the table shouldn’t be a big space penalty.
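Going the other way, the write-side mapping from the discriminated union to the sparse columns might be sketched like this (the tuple shape and function name are illustrative; the ValueType codes match the read mapping below):

```fsharp
// One sparse column per DU case: (ValueType, StringValue, DateValue, NumericValue)
let private fieldToColumns (f: FieldValue) =
    match f with
    | StringField s  -> (1uy, Some s, None, None)
    | DateField d    -> (2uy, None, Some d, None)
    | NumericField n -> (3uy, None, None, Some n)
```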

The read mapping is shown below. The full code is available here.

let private mapSingleTemplate (rows:TemplateQueryResultSet list) : Template =
    let fields = [ for row in rows do
                    let (_, defn, value) = row
                    if defn.FieldDefinitionId <> System.Guid.Empty then // empty guid means template has no fields
                        yield {
                            Id = defn.FieldDefinitionId
                            Name = defn.Name
                            Field = match value.ValueType with
                                    | 1uy -> StringField(value.StringValue.Value)
                                    | 2uy -> DateField(value.DateValue.Value)
                                    | 3uy -> NumericField(float value.NumericValue.Value)
                                    | _ -> failwith "Unknown field type"
                        } ]
    let (templateCols, _, _) = rows.Head
    {
        Id = templateCols.TemplateId
        Name = templateCols.Name
        Fields = fields
        MaintenanceProgramId = templateCols.MaintenanceProgramId
    }

let private templateByIdQuery (dc:DbContext) id : System.Linq.IQueryable<_> =
    query { 
        for template in dc.Dbo.Template do
        // (!!) means left outer join
        for fieldDef in (!!) template.``dbo.FieldDefinition by TemplateId`` do
        for fieldVal in (!!) fieldDef.``dbo.FieldValue by FieldDefinitionId`` do
        where (template.TemplateId = id && fieldVal.AssetId.IsNone)
        select (template, fieldDef, fieldVal)
    }

let private templateById id : Template option =
    let dc:DbContext = Sql.GetDataContext()
    let rows = templateByIdQuery dc id |> Seq.toList
    if [] = rows then None else Some (mapSingleTemplate rows)


Moving to Suave meant a new logger was required as the original logger used the .NET Core LoggerFactory. Generally for logging I’ve used log4net or flat files. In this case I decided to try using the System.Diagnostics.Trace tools – a little old fashioned perhaps – but supported in .NET 4.6.1 as well as .NET Core (future-proofing!).

Creating a trace involves two parts:
1. Creating a TraceSource instance and calling methods on it;
2. Adding listeners to the config file
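The first part is small; a minimal sketch against the "Log" source name used in the config below (the message text is illustrative):

```fsharp
// Create a TraceSource whose name matches a <source> entry in the config
let log = new System.Diagnostics.TraceSource("Log")
log.TraceEvent(System.Diagnostics.TraceEventType.Warning, 0, "disk space low")
log.Flush()
```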

In this case it is configured to write warnings and higher to the console, and everything to a file log (AmApi.log).

    <trace autoflush="true" />
    <sources>
      <source name="Log" switchValue="All" switchType="System.Diagnostics.SourceSwitch">
        <listeners>
          <add name="console" type="System.Diagnostics.ConsoleTraceListener">
            <filter type="System.Diagnostics.EventTypeFilter" initializeData="Warning"/>
          </add>
          <add name="logToFileListener"/>
          <remove name="Default"/>
        </listeners>
      </source>
    </sources>
    <sharedListeners>
      <add name="logToFileListener" type="System.Diagnostics.TextWriterTraceListener" initializeData="AmApi.log" />
    </sharedListeners>

Suave has built in logging capabilities, such as the colored text displayed in the console. It allows these logs to be accessed by creating a logging adapter and configuring it to be used. This is described here, although the interface definition is more advanced than given on that page, as illustrated by the adapter shown below. This implementation calls a method, SuaveLog, on the main logger class that understands and converts Suave log levels to Diagnostics.TraceEventType.

type SuaveLoggerAdapter() =
    let _log (msg:Suave.Logging.Message) = 
        use strWriter = new System.IO.StringWriter()
        let txt = Suave.Logging.TextWriterTarget(Suave.Logging.LogLevel.Verbose, strWriter) :> Suave.Logging.Logger
        txt.logSimple msg
        Logger.SuaveLog (strWriter.ToString()) msg.level

    interface Suave.Logging.Logger with
        member __.logSimple msg = _log msg
        member __.log level msgFactory = _log (msgFactory level)
        member __.logWithAck level msgFactory = async { do _log (msgFactory level) }

Configuration is supplied to the startWebServer function. I wanted to preserve the existing logging capabilities, particularly the console output, so a CombiningTarget was used. CombiningTarget sends the log messages to multiple loggers.

let defaultLog = Suave.Logging.Targets.create Suave.Logging.LogLevel.Info
let logger = Suave.Logging.CombiningTarget([ defaultLog; Util.SuaveLoggerAdapter() ])
let config = { defaultConfig with logger = logger }
startWebServer config handleRequest

To complete the picture, the logging class and instance are shown here. I’ve been lazy and used a leading underscore to denote the intention to keep the class private. Alternatively an interface could have been created and a private class defined to implement it. A singleton was also considered, but one never knows when it might be useful to split logs, so I try to avoid that approach.

type _Logger() = 
    let log = new System.Diagnostics.TraceSource("Log")

    let _log (eventType:Diagnostics.TraceEventType) (msg:string) =
        log.TraceEvent(eventType, 0, (sprintf "%s: %s" (DateTime.Now.ToString("yyyy-MM-dd HH:mm:ss.fff K")) msg))

    override this.Finalize() = 
        // flush any pending trace output before the logger is collected
        log.Flush()
        log.Close()

    member this.Info msg = _log Diagnostics.TraceEventType.Information msg
    member this.Warn msg = _log Diagnostics.TraceEventType.Warning msg
    member this.Error msg = _log Diagnostics.TraceEventType.Error msg

    member this.SuaveLog msg (level:Suave.Logging.LogLevel) = 
        let traceEventType = match level with
                                | Suave.Logging.LogLevel.Verbose -> Diagnostics.TraceEventType.Verbose
                                | Suave.Logging.LogLevel.Debug   -> Diagnostics.TraceEventType.Verbose
                                | Suave.Logging.LogLevel.Info    -> Diagnostics.TraceEventType.Information
                                | Suave.Logging.LogLevel.Warn    -> Diagnostics.TraceEventType.Warning
                                | Suave.Logging.LogLevel.Error   -> Diagnostics.TraceEventType.Error
                                | Suave.Logging.LogLevel.Fatal   -> Diagnostics.TraceEventType.Critical
        _log traceEventType msg

let Logger = _Logger()

Suave Gotcha

I had quite the frustrating hour with Suave trying to combine WebParts. In theory it is very simple:

Function1-that-returns-WebPart >=> Function2-that-returns-WebPart

In practice I kept getting an error “Expecting a type supporting the operator ‘>=>’ but given a function type. You may be missing an argument to a function” when trying to run this (thinned out) code:

let getTemplate : WebPart =
    OK "..." 
    >=> Writers.setMimeType "application/json; charset=utf-8"

let route = choose [ pathScan Path.Assets.templateById getTemplate ]
startWebServer config route

The problem: a missing open Suave.Operators, without which F# didn’t know what >=> was. I guess it was trying to treat >=> as an inline function, but how it goes from there to that error message presently baffles me.
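For completeness, the working version needs only the extra opens (module names per the Suave API):

```fsharp
open Suave
open Suave.Operators   // provides >=>
open Suave.Successful  // provides OK
open Suave.Writers     // provides setMimeType

let getTemplate : WebPart =
    OK "..."
    >=> setMimeType "application/json; charset=utf-8"
```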


A Change of Direction

I’ve decided to step away from ASP.NET Core for the moment. I’d like to learn more about F# and good functional practice, and from what I’ve learned so far I feel this would be better achieved via tools and frameworks that were designed for F#. Stepping back also allows use of the .NET Framework and therefore my very much missed debugger.

Choosing a Web Framework

In looking at the options for a web framework for F#, two stood out as being mature and well-supported: WebSharper and Suave.IO. My choice is to use Suave.IO because I’m building an API and so am not in need of all the client capabilities that WebSharper seems to be strong in. To that end I’ve been through (i.e. typed out) the Suave Music Store tutorial and read through the documentation.

Picking up a new framework is never easy, and less so when the paradigm being used is new. To aid with that I’ve got the Suave source code from GitHub so I can debug into functions to understand what is going on. There were a few problems referencing the linked files (e.g. Utils/Facade.fs) when building it, which seemed to mysteriously evaporate when the project was reopened. So I’m finally at the point where I can ‘port’ my existing code over to Suave.IO and see what happens. This is where it is at so far, but I hope to report more progress in a day or two:

let webPart =
    choose [
        path Path.Assets.template >=> choose [
            PUT >=> OK "Template PUT" ]
        pathScan Path.Assets.templateById (fun guid -> OK (sprintf "Get Template by Id: %s" guid))

        path Path.Assets.asset >=> choose [
            PUT >=> OK "Asset PUT" ]
        pathScan Path.Assets.assetById (fun guid -> OK (sprintf "Get Asset by Id: %s" guid))
        NOT_FOUND "No handler found"
    ]

[<EntryPoint>]
let main argv = 
    printfn "%A" argv

    let config = { defaultConfig with bindings = [ { scheme = HTTP; socketBinding = { ip = Net.IPAddress.Parse(""); port = 5000us }} ]}

    startWebServer config webPart
    0 // return an integer exit code