
Dockerizing a NodeJS App

In this post I am documenting the steps I took to convert a traditional NodeJS app, launched from the command line with node app.js, into a fully dockerized container solution. The app uses a MySQL database for which it has static configuration. I am not going into too much detail about the app’s code or architecture, but it is worth noting that it carries a static piece of configuration for connecting to the database.
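The original snippet is not reproduced here, but as a rough illustration, a hard-coded MySQL connection in a Node app tends to look something like the following sketch (the mysql2 client and all host and credential values are assumptions for this example, not the app’s real settings):

```typescript
// Hypothetical static configuration for illustration only -- not the app's
// actual values. Assumes the mysql2 client; adjust for whichever driver is used.
import mysql from "mysql2/promise";

const dbConfig = {
  host: "192.168.1.10", // fixed database host baked into the codebase
  port: 3306,
  user: "appuser",
  password: "secret",
  database: "appdb",
};

async function main(): Promise<void> {
  // Static settings like these are what dockerizing typically replaces with
  // environment variables (e.g. process.env.DB_HOST) so the container can be
  // pointed at a MySQL container at run time instead of a fixed machine.
  const connection = await mysql.createConnection(dbConfig);
  const [rows] = await connection.query("SELECT 1 AS ok");
  console.log(rows);
  await connection.end();
}

main().catch(console.error);
```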

Learn Faster

Carrying on with Jeff Patton’s “User Story Mapping” for my project, after slicing releases it is time to put a framework in place to learn faster. Once you have your product idea and have asked yourself who your customers are, how they will use it and why they need it, you need to validate that the problem your product will solve really exists. Find a handful of people from your target market and try to engage them.

Build Less

Carrying on with Jeff Patton’s “User Story Mapping” for my project, following on from Framing the Big Picture the next step is to plan on building less, because there is always more to build than you have people, time and money for. Story mapping helps big groups build shared understanding. If the product has stories that cross multiple teams’ domains, get all the teams together so that you can map a product release across all of them; this will help visualize the dependencies between the teams.

Continuous Deployments

Continuous Deployments is the next stage of automation, following on from its predecessors continuous integration (CI) and continuous delivery (CD). The integration phase of a project used to be the most painful step: depending on the size of the project, developers work in isolated teams dedicated to separate components of the application for a very long time, and when the time comes to integrate those components, a lot of issues, like unmet dependencies and interfaces that don’t communicate, are dealt with for the first time. The idea of CI was conceived to combat this problem.

The Big Picture

Reading Jeff Patton’s “User Story Mapping” I have been applying the ideas in a small project I am working on, an online grocery shopping service, gengeni.com. In this post I am focusing on the big picture. Jeff insists on creating documents which promote a shared understanding through user stories (rather than the traditional requirements, which are prone to misinterpretation). He stresses that we are building software not for the sake of it but to make things better and solve real-world problems; therefore we should focus on maximising the outcome (how we make things better) while minimizing the output (software components).

Service Discovery and Proxying

Delivery of software as microservices running on immutable and self-sufficient containers is a very robust method and has gained a lot of popularity in recent years. Containers usually expose the microservice as a web service accessible through a certain port number on the host. Because host machines are able to run many containers, and because these containers need to be started and shut down quickly and easily without any side effects, it is not really feasible for consumers of these web services to point to manually assigned hosts and ports.
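As a minimal sketch of the alternative, the snippet below resolves a service’s host and port from a Consul-style registry at call time rather than hard-coding them; the registry address, the “orders-api” service name and the instance-selection strategy are illustrative assumptions, not details from the post.

```typescript
// Minimal sketch: looking up a service's host and port in a Consul-style
// registry instead of hard-coding them. The registry address, the
// "orders-api" name and the random selection are illustrative assumptions.
interface CatalogEntry {
  Address: string;        // address of the node the container runs on
  ServiceAddress: string; // address the service registered itself with
  ServicePort: number;    // the (possibly dynamically assigned) host port
}

async function resolveService(name: string): Promise<{ host: string; port: number }> {
  // Consul exposes registered services at /v1/catalog/service/<name>
  const res = await fetch(`http://localhost:8500/v1/catalog/service/${name}`);
  if (!res.ok) throw new Error(`registry lookup failed: ${res.status}`);

  const entries = (await res.json()) as CatalogEntry[];
  if (entries.length === 0) throw new Error(`no instances of ${name} registered`);

  // Naive load spreading: pick one registered instance at random.
  const entry = entries[Math.floor(Math.random() * entries.length)];
  return { host: entry.ServiceAddress || entry.Address, port: entry.ServicePort };
}

// Consumers ask the registry at call time instead of pointing at a fixed host:port.
resolveService("orders-api").then(({ host, port }) =>
  console.log(`calling http://${host}:${port}/`)
);
```

A reverse proxy such as nginx or HAProxy can play the same role from the consumer’s side, keeping a stable address in front of whatever containers are currently registered.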

Containers

Virtual Machines: In the quest for maximising the efficiency of computing power available on servers, Virtual Machines (VMs) came into existence, with products from firms like VMware and VirtualBox pushing the concept to general users. “In computing, virtualization refers to the act of creating a virtual (rather than actual) version of something, including virtual computer hardware platforms, operating systems, storage devices, and computer resources.” - Wikipedia. Virtual machines are created on top of hypervisors, which run on top of the host machine’s operating system (OS). Hypervisors allow emulation of hardware like CPU, disk, memory and network, and server machines can be configured to create a pool of emulated hardware resources available for applications, in the process making the actual hardware resources on those servers utilized much more efficiently.

Decoupling API Versions From Codebase Versions

When developing a package (any piece of reusable code, like a class library to be loaded or a web service accessible through HTTP) that has a published API, it is necessary to have a clear separation between the API version and the codebase version of the package. The API is what the package exposes for users to consume; it should be documented clearly, and the module should include thorough tests that exercise the entire published API to assert it conforms to the documentation.
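As an illustrative sketch (the module name, versions and functions below are hypothetical, not taken from the post), a package might keep its codebase version in package.json while advertising the API version it implements separately, with tests pinned to the documented surface:

```typescript
// greeter.ts -- illustrative names only.
// The codebase version lives in package.json (say "3.4.1") and moves with every
// refactor or bug fix; API_VERSION changes only when the published contract does.
export const API_VERSION = "2.0";

// The published API: the only surface users are allowed to depend on.
export interface Greeter {
  greet(name: string): string;
}

export function createGreeter(): Greeter {
  return { greet: (name: string) => `Hello, ${name}!` };
}

// greeter.test.ts -- the tests exercise the documented API and nothing else, so a
// passing suite asserts the package still conforms to what API_VERSION promises.
import assert from "node:assert";
import { API_VERSION, createGreeter } from "./greeter";

assert.strictEqual(API_VERSION, "2.0");
assert.strictEqual(createGreeter().greet("world"), "Hello, world!");
```

Under semantic versioning the same idea applies: a bug-fix release moves the codebase version but leaves the advertised API contract untouched.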