Last week I introduced the proof of concept for my FarmCraft project. As an extension of that, I thought I'd start a developer log covering the work I'm doing and have completed.

This will give me the opportunity to keep those interested in the project up to date, and allow me to share my thoughts and decisions as I go. I feel like keeping a log will also help with accountability and drive on my end. You can think of it like a weekly dev AAR (After Action Report).

Over time, the scope of these logs will extend beyond FarmCraft, since I'll have other projects in the works. I'd like to keep to a specific format, however, and have decided on the following components:

  • Time: A breakdown of how my time was spent
  • Tasks: An overview of the work completed during the week's sprint
  • Successes: Things that went well
  • Struggles: Troubles I encountered, or things that could have gone better
  • Final Thoughts: A general reflection on the past week


Time

As you can see below, the majority of my time this week was spent on the User service. I worked on getting some tests and basic functionality built out, in addition to interacting with Azure AD B2C.



Tasks

As far as tasks for this sprint go, things were going well initially, until I realized I had two different ideas for how FarmCraft should be built and function (explained below). At that point, I had to add some additional tasks that I didn't quite get to.




Successes

Test-Driven Development

This week I made my first real attempt at test-driven development. I'm sure my tests could use some improvement and may not follow best practices, but I went through and created tests prior to writing any code, which was quite helpful in figuring out method signatures and what I really needed (and wanted) to do.
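As a rough illustration of that flow, here is a minimal sketch in Python for brevity (the real service is .NET-based, and all names here are hypothetical, not from the actual FarmCraft code): the test is written first, and the implementation only follows once the test has pinned down the shape of the API.

```python
import itertools

# Hypothetical sketch of the test-first flow: this test was "written"
# before the code below it exercises. Names are illustrative only.

def test_register_user_assigns_id():
    service = UserService()
    user = service.register("jane@example.com")
    assert user.id is not None
    assert user.email == "jane@example.com"

# Only after the failing test existed did the implementation follow:

class User:
    def __init__(self, id, email):
        self.id = id
        self.email = email

class UserService:
    _ids = itertools.count(1)  # naive in-memory id generator

    def register(self, email):
        # Writing the test first forced early decisions, e.g. "register
        # takes an email and returns a User that already has an id".
        return User(next(self._ids), email)

test_register_user_assigns_id()
```

The point isn't the toy code itself, but that the test forced the method signature and return type to exist on paper before any persistence or actor plumbing got involved.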

Akka Remote

In the past, I've only worked with Akka in the context of a single actor system. This week I was able to expand on that, running a core actor system as a background service while using additional actor systems to communicate with it over TCP. In essence, I had three different actor systems:

  1. The core actor system running as a background service
  2. An actor system on local Azure Functions that forwards requests to the "core"
  3. An actor system on a web API that forwards requests to the "core"
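Getting the edge systems talking to the core is mostly configuration. A minimal HOCON sketch for the core system, assuming Akka.NET's Akka.Remote with the DotNetty TCP transport (the host name and port are placeholders, not the project's actual values):

```hocon
akka {
  actor {
    # Use the remoting actor-ref provider instead of the default local one
    provider = remote
  }
  remote {
    dot-netty.tcp {
      hostname = "0.0.0.0"  # placeholder bind address for the core system
      port = 8081           # placeholder; the edge systems dial this port
    }
  }
}
```

The Azure Functions and web API systems can then reach the core through an actor selection such as `akka.tcp://CoreSystem@core-host:8081/user/supervisor` (again, placeholder names).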


Struggles

Azure B2C Subscriptions

My first struggle of the week came with Azure AD B2C subscriptions. My hope was to use the Microsoft Graph SDK to subscribe to user changes within Azure AD so I could handle user creations and modifications. I successfully created subscriptions, but never received notifications after changing user details in the Azure Portal or after registering a new user.
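For context, creating the subscription boils down to a single call against the Graph subscriptions endpoint; the request looks roughly like this (the endpoint and field names come from the Graph change-notifications API, while the notification URL and expiry are placeholders):

```http
POST https://graph.microsoft.com/v1.0/subscriptions
Content-Type: application/json

{
  "changeType": "updated,deleted",
  "notificationUrl": "https://example.com/api/user-notifications",
  "resource": "users",
  "expirationDateTime": "2030-01-01T00:00:00Z"
}
```

If I'm reading the docs correctly, creations on directory resources surface through the `updated` change type, which is why there's no `created` entry above.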

I had seen that webhooks didn't work with B2C, which is why I pursued Graph change notifications in the first place, but after spending a couple of hours on it, I eventually found documentation stating that user subscriptions aren't supported in Azure AD B2C or for personal accounts, which is exactly what I was using 😥

Requirement Conflicts

About halfway through the week, as I was working through all of my user tests, I came to the realization that I had two conflicting requirements:

  1. An easy to use, SaaS platform that you can just sign up for online to get started
  2. A simple to install, on-prem version that could be managed without internet connectivity if required

In reality, I don't think this is so much a requirements conflict as a difference in target audience. On one hand (#1), you have individuals in an urban environment with good internet access who want something they don't have to manage. On the other hand (#2), you have individuals in rural areas who may or may not have a decent internet connection but still need things to operate effectively. This could also include individuals who are more tech- or privacy-focused and don't want all of their information going to a third party.

Ultimately, I think this results in two different products. For #1, I see a scalable SaaS solution built on Azure, as I originally intended, while for #2 I see a smaller "community" version that is easy to install and set up locally.

Akka & Azure Functions

For the SaaS version of FarmCraft, one of the requirements is integration with Azure AD B2C. This means that when someone registers with the B2C tenant, an API Connector would call my service before a token is issued, checking whether the user exists in my database and creating it if needed.
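For reference, the API Connector contract is simple: B2C POSTs the sign-up attributes to your endpoint, and the endpoint replies with a small JSON verdict. A continuation response looks roughly like this (shape per the documented contract; I haven't wired it end to end yet):

```json
{
  "version": "1.0.0",
  "action": "Continue"
}
```

The `action` field can also be `ShowBlockPage` or `ValidationError` if the sign-up should be rejected.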

This is easy to do by creating an Azure Function that connects to the database. However, rather than opening additional database connections and splitting logic between my core actor system and Azure Functions, I wanted the Azure Function to use Akka Remote to make a request to the main actor system.

The issue is that, by default, Azure App Services only allow communication over ports 80 and 443, which leaves no port available for Akka's remoting transport. To fix this, I would need to spin up a VM or an App Service Environment (ASE) and expose the additional ports that way.

Final Thoughts

Overall, I'd consider this week mediocre. I did have some successes with Akka Remote and TDD, but they were overshadowed by the issues using Akka Remote from an Azure Function and by the question of whether to focus on the SaaS or on-prem version of FarmCraft.

Taking into account that this is currently a side project and I'm trying to limit expenses as much as possible, I think I'll shift focus to the local version for the time being. Utilizing Azure would provide scalability and additional income if it takes off, but it will also end up costing a fair amount as I develop everything.

  • Azure VMs (One per microservice that can expose ports for Akka): Minimum $25.00 each / month
  • Azure Kubernetes Service (possibly an option): $160.00 / month
  • Azure CosmosDb (one per microservice): $23.00 each / month
  • Azure API Management (To hide the backend services): $50.00 / month
  • Azure Service Bus: $10.00 / month
  • Azure IoT Hub: $0.00 / month

On the bright side, I'll be able to gain more experience with technologies such as:

  • RabbitMQ
  • Docker
  • Docker-Compose
  • Postgres