Sunday, October 15, 2017

Thoughts on AngularMix 2017

I've been very busy with side projects lately, so it's been a long time since I've taken time to write anything up.

However, I just got back from AngularMix this past week, which was a great experience and definitely worth a post.  A lot of interesting things are coming out of the Angular world, and I'll share my thoughts on a couple of the highlights (for me).  There were a lot of great talks, but I think my favorite part was the opportunity to meet and talk with developers, including some of the core Angular team members.


Angular Elements


One thing that hasn't been as easy in Angular as it was in AngularJS is simply adding an Angular component to an existing page.  There are a lot of use cases here, such as having a mix of technologies (React and Angular), or just wanting to start converting functionality from existing server-side rendered pages by adding an Angular component.

Rob Wormald gave a talk here on a new enhancement to Angular they are calling "Elements", and had some really nice demos of the early pre-release functionality.  Elements basically allows any component (or even an entire application) to run as a standalone custom HTML element that can self-bootstrap into any HTML page.  He showed an example where he was easily able to add independent copies of an Angular component to a static HTML page, and manipulate them by setting properties on the components.  This is really a HUGE step forward for interoperability of Angular components with other technologies.
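
To give a rough idea of the model, here's my own sketch (not code from the demo, and since Elements was still pre-release the registration API and names here are illustrative): you wrap a component as a browser custom element and register it, and from then on it can be dropped into any HTML page.

// Hypothetical sketch only: wrapping a component as a custom element.
// The Elements API was pre-release at the time, so treat these names as illustrative.
import { Injector, NgModule } from '@angular/core';
import { createCustomElement } from '@angular/elements';
import { HelloWidgetComponent } from './hello-widget.component';

@NgModule({
  declarations: [HelloWidgetComponent],
  entryComponents: [HelloWidgetComponent]
})
export class WidgetModule {
  constructor(private injector: Injector) {}

  ngDoBootstrap() {
    // Convert the component into a custom element class and register it with the browser.
    const widgetElement = createCustomElement(HelloWidgetComponent, { injector: this.injector });
    customElements.define('hello-widget', widgetElement);
  }
}

Once registered, a static page can just use <hello-widget></hello-widget> like any other tag, and setting properties on the DOM element flows into the component's inputs.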

Nx from Nrwl.IO


Nrwl was one of the main sponsors of AngularMix, and Victor Savkin gave a couple of talks here.  One interesting talk was about applying a common set of terminology and message-based design principles in NGRX.  Some of the content is from a post he previously wrote here, which is definitely worth a read.
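
To give a flavor of what "message-based" means in practice, here's my own hypothetical sketch (not code from the talk): actions are treated as typed messages, where a command expresses intent and an event records something that happened.

// Hypothetical sketch, not code from the talk: NGRX actions as typed messages.
import { Action } from '@ngrx/store';

// A command: expresses the intent to do something.
export class AddTodo implements Action {
  readonly type = '[Todo Page] Add Todo';
  constructor(public readonly text: string) {}
}

// An event: records something that already happened.
export class TodoAdded implements Action {
  readonly type = '[Todo API] Todo Added';
  constructor(public readonly id: number, public readonly text: string) {}
}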

However, the bigger story for me was the release of Nx, a suite of tools integrated with the CLI for enhancing Angular applications.  There are tools for simplifying the generation of NGRX code, and also a set of functionality for supporting breaking up your Angular applications into libraries.  As long as you use a monorepo for your Angular codebase, the Nx tools make it easy to refactor your code into reusable modules.  Until now, it really hasn't been easy to work with code as libraries in the context of a CLI-based workflow, so this is a huge step forward.
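
As a hypothetical example of what that looks like (the workspace name "myorg" and library name "shared-ui" are made up): once a library has been generated inside the monorepo, application code imports it through its scoped barrel path rather than a relative file path.

// Hypothetical example: importing an Nx workspace library (names are made up).
import { NgModule } from '@angular/core';
import { SharedUiModule } from '@myorg/shared-ui';

@NgModule({
  imports: [SharedUiModule]
})
export class AppModule {}

Because the library lives in the same repo, refactoring shared code into it is mostly a matter of moving files and updating these imports.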

Developers inherit the Internet


One thing that really amazes me is how web technologies are driven forward by really small groups (or even individuals) who have a passion for improving things and making lives better for developers.  People make a big deal out of JavaScript framework fatigue, but to me this is a symptom of a healthy, thriving community, and seeing the speakers at AngularMix really drove home the point for me.  With the open source movement, combined with sites like GitHub that provide the ability to share code with anyone in the world for free, pretty much anyone with development skills can make a contribution.  What an amazing time we live in =)

Monday, May 1, 2017

First look at Angular Material 2 Data Table

For anyone working with Angular 2, there are a number of options out there in the data table space, but noticeably lacking has been one in the Material 2 project.  If you're looking for good support of Material styling, there are only a few options that I've found:

1. MDL - just a styled MDL table, or an angularized version of it.
2. ngx-datatable
3. ng2-md-datatable

These are decent options depending on your needs, but if you've been waiting for an official version, we've been out of luck until now.  It looks like we're finally getting close to a release of the DataTable in Material 2. Work has been visibly in progress for a while (and people have been begging for it!) over at the official tracking issue here: https://github.com/angular/material2/issues/581

Right now it's still sitting in a fork of the official repo, but I took the opportunity to pull the base files out to take a look and get it running in a simple project.

So...for anyone who wants to play with the data table in the context of an Angular CLI based project, I kind of shoehorned the files from the branch into a minimal CLI project, and put the table on the AppComponent page. You can add it to any CLI project by basically following the setup done in this base project.

Code is here: https://kmkatsma.github.io/md-table-cli-demo/

Running demo is here: https://kmkatsma.github.io/md-table-cli-demo/
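
For context on how the table gets its rows: it's built around a DataSource abstraction, where the table calls connect() and renders whatever the returned stream emits.  Here's a minimal sketch of what a data source looks like; the import path and exact signatures may differ in the pre-release branch, so treat this as illustrative.

// Minimal illustrative sketch of the DataSource pattern the new table is built on.
// Import paths may differ in the pre-release branch; treat names as assumptions.
import { DataSource } from '@angular/cdk/collections';
import { Observable } from 'rxjs/Observable';
import 'rxjs/add/observable/of';

export interface Person {
  name: string;
  age: number;
}

export class PeopleDataSource extends DataSource<Person> {
  private people: Person[] = [
    { name: 'Ada', age: 36 },
    { name: 'Grace', age: 45 }
  ];

  // The table subscribes to this stream and renders a row per item emitted.
  connect(): Observable<Person[]> {
    return Observable.of(this.people);
  }

  disconnect() {}
}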

One thing I can't figure out is how to get the headers to show without loading data in the table.  But overall, it seems to be a pretty good approach.  I like that it just uses CSS for sizing and formatting; that seems like a good way to avoid having to build it into the API of the grid itself.  In any case, I'm excited to finally get a chance to see the work being done!


Saturday, February 25, 2017

Using Docker for ASP.NET Core projects on Windows 10

Recently, I've been trying to get familiar with Docker containers for deployment scenarios, in both Go and .NET Core.  One thing I really wanted was an environment that supported Docker on both Windows and Linux.  Fortunately, Docker for Windows now supports this.

In order to use Docker for Windows, you must be running Windows 10 Professional.  This is required because Docker for Windows requires Hyper-V, which is not available in Windows 10 Home.

After the Docker for Windows installation, Docker will try to turn Hyper-V on for you, but note that even after upgrading, this might not work.  It did not for me.  Fortunately, I found a great document for resolving Hyper-V problems here at petri.com in case you have issues (one of the rare times StackOverflow didn't immediately answer my problem!).

So after I installed Docker on my Windows 10 machine by following the steps documented on the Docker site (they have done a great job documenting it, I think), I installed the Visual Studio 2017 RC, which has the latest .NET SDK tooling.  I then created a new .NET Core Web API using the new project wizard, and checked the option for Docker support.  For reference, that option is circled below:



By selecting this option, the VS Docker tooling automatically adds a Dockerfile and docker-compose support to the project, which can be used to build and run the project in a Docker container.   This is pretty cool, but one thing I don't like is that using the Visual Studio tooling obscures what is actually going on under the covers.

In order to get a better feel for what is actually going on in Docker, I switched to using the command line to work with Docker, and found a decent tutorial here at stormpath.com.  I still had some issues despite following it, and what follows is a brief overview of how I got it all working.

First, in my project, I used the following Dockerfile, which I set to "Copy Always" for the output option, so the Dockerfile is included in the output of the "dotnet publish" command.

FROM microsoft/aspnetcore:1.0
COPY . /app
WORKDIR /app
EXPOSE 5000/tcp
ENV ASPNETCORE_ENVIRONMENT Development
ENV ASPNETCORE_URLS http://*:5000
ENTRYPOINT ["dotnet", "Holidays.dll"]

The first line, FROM microsoft/aspnetcore:1.0, pulls in an optimized Linux runtime image for an aspnetcore project.  The COPY command copies the current directory contents to the /app folder, and then WORKDIR is set to that /app folder.  I found some comments that using anything besides "/app" will not work, but I didn't test that.

EXPOSE declares the specific port the container listens on, so it can be opened up to the outside runtime environment; the actual mapping to a host port happens when the container is run (the -p flag in step 3 below).

The next two ENV lines set runtime environment variables.  These are not required, but I included them as a form of documentation.  If you leave the URLs entry out, the container will default to the Production environment and listen on port 80.   The port in the URLs entry should match the EXPOSE entry.  By overriding these, I just made it clear where the values are coming from.
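
As an aside, these don't have to be baked into the image; docker run's -e flag can override them when the container is started, for example (with <image-name> as a placeholder for your image):

docker run -e ASPNETCORE_ENVIRONMENT=Staging <image-name>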

Finally, ENTRYPOINT starts the dotnet process, which runs the app in the Kestrel web server, and launches the DLL you pass as the second argument.  This should be the name of the DLL created during compilation.

After setting up this Dockerfile, use the following steps at the command line, from the root directory of the project you wish to build a container for.  I use Git Bash (some people prefer PowerShell).

1. dotnet publish  
2. docker build bin/debug/netcoreapp1.0/publish -t holidays  (note: holidays is what the docker image is named).
3. docker run -d -p 80:5000 --name holidaytest holidays  (note: this creates a container and names it based on the value after the --name argument.  In this example, holidaytest becomes the name of the container created from the holidays image built in step 2.  The -p 80:5000 maps the current machine's port 80 to the internal port used in the container, which was set by the Dockerfile's EXPOSE entry, 5000 in my example).

After these steps are complete, you can navigate to the URL for one of the web api controllers on localhost on your machine, and it will run.
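
For example, assuming the default ValuesController from the Web API template is still in the project, something like this should return its sample data:

curl http://localhost/api/values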

To shut down and then restart this container, you can use the following:

docker stop holidaytest
docker start holidaytest

To get a list of images you have on your machine:

docker images

To get a list of containers on your machine, running or not, use:

docker ps -a

It's important to understand the difference between an image, which is just a basic recipe for the container, and the container itself, which is an instantiated version with port mappings, etc.  I didn't quite get this for a while, and it caused me some problems!