
ASP.NET Core, PostgreSQL, Docker, and Continuous Integration with Jenkins

Following the Kansas City Developer Conference and the release of ASP.NET Core 1.0, I decided to try out the new framework, to deploy infrastructure with my application, and to use a continuous integration server. This post summarizes what I did and the result of that effort; however, I want to stress that this is in no way a recommendation of how one should securely build and deploy an application. A lot of these technologies are brand new to me, and my goal was just to get them to work. But, instead of waiting until everything is perfect, I wanted to write about what I had so far in case it helps someone else.

First, a list of the technologies I used and how far I took them:
  • ASP.NET Core with ASP.NET MVC 6 - Deploy and run the default template with Entity Framework and ASP.NET Identity (including the ability to register and log in) on Linux using Kestrel and NOT IIS or SQL Server
  • PostgreSQL - Used as my database instead of the more traditional choice of SQL Server
  • Docker - Used to host Linux containers for my ASP.NET MVC 6 applications, PostgreSQL database, and HAProxy load balancer. Also allows me to deploy the infrastructure with the application
  • HAProxy - Used as a load balancer
  • Jenkins - Used as my continuous integration server to automatically build and deploy the application AND its infrastructure

ASP.NET Core with MVC 6
With Visual Studio 2015 completely up to date, I started with File -> New Project -> .NET Core -> ASP.NET Core Web Application. In the next dialog, I chose "Web Application" and, for authentication, I chose "Individual User Accounts". I made sure nothing was checked for Azure. Next, I modified the Main method of Program.cs to ensure Kestrel would listen on ALL interfaces instead of just localhost. To do that, I added .UseUrls("http://0.0.0.0:5000") as shown below.
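
A sketch of that change, based on the default 1.0 template (only the .UseUrls line is new):

    public static void Main(string[] args)
    {
        var host = new WebHostBuilder()
            .UseKestrel()
            .UseContentRoot(Directory.GetCurrentDirectory())
            // Listen on all interfaces, not just localhost
            .UseUrls("http://0.0.0.0:5000")
            .UseIISIntegration()
            .UseStartup<Startup>()
            .Build();

        host.Run();
    }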

In order to use the PostgreSQL database, I uninstalled the Microsoft.EntityFrameworkCore.SqlServer and Microsoft.EntityFrameworkCore.SqlServer.Design packages. Then, I installed Microsoft.EntityFrameworkCore.Tools.Core and Microsoft.EntityFrameworkCore.Tools (both need -Pre for Install-Package), along with Npgsql.EntityFrameworkCore.PostgreSQL.
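
In the Package Manager Console, those changes amount to roughly:

    Uninstall-Package Microsoft.EntityFrameworkCore.SqlServer.Design
    Uninstall-Package Microsoft.EntityFrameworkCore.SqlServer
    Install-Package Microsoft.EntityFrameworkCore.Tools.Core -Pre
    Install-Package Microsoft.EntityFrameworkCore.Tools -Pre
    Install-Package Npgsql.EntityFrameworkCore.PostgreSQL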


Once the project has support for PostgreSQL in Entity Framework, the connection string needs to be updated and Entity Framework needs to be configured to use PostgreSQL. So, I modified the default connection string in the appsettings.json file as shown below.
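
A sketch of the relevant part of appsettings.json, with a placeholder host, database, user, and password (they need to match whatever the PostgreSQL setup script creates later):

    {
      "ConnectionStrings": {
        "DefaultConnection": "Host=localhost;Port=5432;Database=webappdb;Username=webappuser;Password=webapppassword"
      }
    }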


Then, I found the existing services.AddDbContext related code in the ConfigureServices method in Startup.cs, and I modified it to use PostgreSQL. The code is shown below.
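
Roughly, the only change is swapping the SQL Server call for UseNpgsql (class and configuration names come from the default template):

    // In ConfigureServices in Startup.cs: use PostgreSQL via Npgsql instead of SQL Server
    services.AddDbContext<ApplicationDbContext>(options =>
        options.UseNpgsql(Configuration.GetConnectionString("DefaultConnection")));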



Docker (For Windows)
Yes, I'm using Docker on Windows. That means I can use Visual Studio AND I can deploy to a Linux container. Later, I can run it on a production system running Linux without having to change a single thing. So, using Windows works great for my purposes. You could also do this on Mac OS X, your favorite Linux distro, etc., but you would need to use Visual Studio Code as your IDE instead. To set up Docker, I went to their website, downloaded Docker for Windows, and installed it.

PostgreSQL (Using Docker)
For my database instance, I used a Linux Docker container to host PostgreSQL. I used the "official" image found at https://hub.docker.com/_/postgres/. That official image allows a script to be run at startup to get the database set up the way you want. The script must be placed in the /docker-entrypoint-initdb.d/ directory of the Docker container (the official documentation uses the name "init-user-db.sh"). My script creates a database and a user for accessing that database.
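
A sketch of that kind of script, reusing the placeholder names from the connection string above (the psql calls follow the pattern in the official image's documentation, using the POSTGRES_USER variable the image provides):

    #!/bin/bash
    set -e

    # Create an application role and a database owned by it.
    # The name and password are placeholders and must match the app's connection string.
    psql -v ON_ERROR_STOP=1 --username "$POSTGRES_USER" -c "CREATE USER webappuser WITH PASSWORD 'webapppassword';"
    psql -v ON_ERROR_STOP=1 --username "$POSTGRES_USER" -c "CREATE DATABASE webappdb OWNER webappuser;"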

Next, I set up the Dockerfile for that container. It looks like this:
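
Something along these lines, built on the official image (the 9.5 tag is a guess):

    # Base on the official PostgreSQL image
    FROM postgres:9.5

    # Scripts in this directory run automatically the first time the container starts
    COPY init-user-db.sh /docker-entrypoint-initdb.d/init-user-db.sh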

I can now build and run the container:
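
Roughly, with placeholder image and container names (port 5432 is published so the app running on the host can reach the database while developing):

    docker build -t mypostgres .
    docker run -d -p 5432:5432 --name postgres-db mypostgres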


Apply Entity Framework Migrations
Now that the database was running, I applied the Entity Framework migrations so the Login and Register features would work. To do that, I used the command line, changed directory to the root of the application (the place where project.json is located), and ran "dotnet ef database update".
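
In other words, something like this (the path is a placeholder):

    cd C:\projects\WebApplication1
    dotnet ef database update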



Run the Application
Finally, I ran the application with "dotnet run" and visited the site in a browser at http://localhost:5000/. I could now register a new user and log in. All of that data is stored in the PostgreSQL database running in the Docker container.


Hosting the ASP.NET Core MVC 6 Template in a Docker Container
After verifying I could run the application and connect to the database, I wanted to host the application itself in a Docker container. Microsoft has some ready-made Docker images (https://hub.docker.com/r/microsoft/dotnet/) that I used to accomplish it. My Dockerfile follows their instructions: copy the application's root into the container, run "dotnet restore" to download all the required NuGet packages, and use "dotnet run" to start the application.
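
A sketch of that Dockerfile (the SDK image tag is an assumption; use whichever tag matches the installed tooling):

    # Official .NET Core SDK image from Microsoft
    FROM microsoft/dotnet:1.0.0-preview2-sdk

    # Copy the application source into the image and restore NuGet packages at build time
    COPY . /app
    WORKDIR /app
    RUN ["dotnet", "restore"]

    # Kestrel listens on port 5000 (see the UseUrls call above)
    EXPOSE 5000

    # Start the application
    ENTRYPOINT ["dotnet", "run"]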


Here's how I built and ran that Docker container:
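
Something like this, with placeholder image and container names:

    docker build -t mywebapp .
    docker run -d -p 5000:5000 --name webapp1 mywebapp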
 

Finally, I visited http://localhost:5000 to test that the application is running.

HAProxy
This is the first time I've used HAProxy. Basically, I saw that other people had used it and I thought I would try it. To get it to work, I found an example configuration file, made a casual attempt to understand some of the settings in HAProxy's documentation, and then just messed with it until it worked. I don't recommend doing it this way for a real application. I was very happy when it finally forwarded my traffic correctly!

One of the key challenges I had was that I couldn't point HAProxy's configuration at localhost:5000 and localhost:5001 for two exposed instances of my ASP.NET Core MVC application. Since I wasn't familiar with Docker, it took a while to figure out what to do. I eventually learned that you can "link" containers together and that some environment variables are automatically populated to help refer to those instances. Also, HAProxy can use environment variables in the configuration file. As a result, here's the haproxy.cfg configuration I came up with:
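
A sketch of that kind of configuration, assuming the two application containers are linked into the HAProxy container with the aliases webapp1 and webapp2 (so Docker populates environment variables such as WEBAPP1_PORT_5000_TCP_ADDR):

    global
        maxconn 256

    defaults
        mode http
        timeout connect 5s
        timeout client  50s
        timeout server  50s

    frontend http-in
        bind *:80
        default_backend webapps

    backend webapps
        balance roundrobin
        # Cookie-based stickiness, so the same client hits the same server
        cookie SERVERID insert indirect nocache
        # HAProxy expands environment variables inside double quotes
        server webapp1 "${WEBAPP1_PORT_5000_TCP_ADDR}:${WEBAPP1_PORT_5000_TCP_PORT}" cookie webapp1 check
        server webapp2 "${WEBAPP2_PORT_5000_TCP_ADDR}:${WEBAPP2_PORT_5000_TCP_PORT}" cookie webapp2 check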


In the configuration, you can see some references to using a cookie to ensure the same client hits the same server each time. I never actually got that part to work.

My Dockerfile for HAProxy can be found below. Again, I used the official image (https://hub.docker.com/_/haproxy/) and followed its instructions.
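
Roughly (the 1.6 tag is a guess):

    # Official HAProxy image
    FROM haproxy:1.6

    # The official image expects its configuration at this path
    COPY haproxy.cfg /usr/local/etc/haproxy/haproxy.cfg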


Here are the commands I ran to bring up the web servers, the load balancer, and link them together.
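
Something along these lines, with placeholder image and container names; the link aliases need to match the environment variable names used in haproxy.cfg:

    docker run -d --name webapp1 mywebapp
    docker run -d --name webapp2 mywebapp

    docker build -t myhaproxy .
    docker run -d -p 80:80 --link webapp1:webapp1 --link webapp2:webapp2 --name loadbalancer myhaproxy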




Next, I visited http://localhost:80 to see that everything worked.

Jenkins
I used Jenkins throughout this process as a continuous integration server. There's a whole lot I don't know how to do correctly with Jenkins, but I will share the build scripts I used. I have separate batch files for each step of the build process to ensure each return code is evaluated and the build process will fail if any of them returns anything other than 0. In the Jenkins job, each of those batch files runs as its own build step.

And here are those build scripts:
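
Each one is a small, single-purpose batch file, roughly along these lines (file names, image names, and steps are illustrative placeholders):

    REM build-webapp-image.bat -- rebuild the application's Docker image
    docker build -t mywebapp .
    EXIT /B %ERRORLEVEL%

    REM run-webapp1.bat -- start the first application container
    docker run -d --name webapp1 mywebapp
    EXIT /B %ERRORLEVEL%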


I don't think this one is actually necessary. I think I was experimenting with publishing, but I'm going to include it here just in case.
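
If it was the publishing experiment, it probably looked something like this (a guess):

    REM publish.bat -- publish the compiled application to an output folder
    dotnet publish -c Release -o published
    EXIT /B %ERRORLEVEL%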


Conclusions
There's a lot to be improved upon, but for now, I don't have time to work on it further. One thing I really wanted to get working was to "publish" the application instead of simply running the code within the Docker container. It would be nice to deploy the full application without needing to redownload all the NuGet packages every time. I was able to get a basic version of this working, but then I ran into an issue where my views weren't actually being updated after I modified them, and I wasn't able to troubleshoot it further.


