Thanks to online feedback, development of the monitoring dashboard is progressing. This week I worked on improvements and features that I felt should be present before release. The following changes were implemented:

1. Event viewer and logged-on user data are included

Initially, the data set only contained information about the CPU and memory usage of running processes. I have now added event viewer details about events that need attention. In addition, it is possible to see which users are logged on to the system. The event viewer details were added through the Get-EventLog cmdlet, whereas the logged-on users are retrieved from a legacy application named quser.exe.

2. The performance counters are now dynamic

The first version of the application had counter information hard-coded into the interface, so the visual counters did not represent the data flowing through the back-end. The counters now reflect the data transmitted by the requests. This was a tough one, as the Chart.js framework has some caveats. The result isn't perfect from a cross-browser/device perspective, but I'll find a way to work around the small bugs that are still present.
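Making the counters dynamic essentially means shaping each incoming batch of samples into the structure Chart.js expects. A minimal sketch, assuming each back-end response is a list of { timestamp, cpu, memory } objects (illustrative field names, not the project's actual API):

```javascript
// Convert raw counter samples into a Chart.js-style data object,
// keeping only a rolling window of the most recent points.
function toChartData(samples, maxPoints = 20) {
  const recent = samples.slice(-maxPoints);
  return {
    labels: recent.map(s => s.timestamp),
    datasets: [
      { label: 'CPU %',    data: recent.map(s => s.cpu) },
      { label: 'Memory %', data: recent.map(s => s.memory) },
    ],
  };
}

// In the browser the result would be assigned to an existing chart:
//   chart.data = toChartData(samples);
//   chart.update();
```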

3. Authentication and MongoDB

The application now supports logging on and fully represents a MEAN stack (MongoDB/Express/Angular/NodeJS). I opted for a local authentication strategy. The passwords are hashed and salted on the back-end. I look forward to creating new database schemas to store the logging data from the servers.

4. Security

Passwords used by the PowerShell scripts are now stored encrypted.

Next milestones:

Templating and routing (CMS)

I started out with a mock-up that turned into a single-page application showing data from one server. At the moment, additional monitoring pages are added by manually changing the Express routing. Since the number of web pages depends on the number of servers that are configured, and the routing for these pages should be set up automatically, I need to write a module that creates the routes dynamically, comparable to CMS logic.
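Such a module could boil down to iterating over the configured servers and registering one Express route per server. A minimal sketch, where the server list and the URL scheme are assumptions:

```javascript
// Register one monitoring route per configured server, instead of
// editing the Express routing by hand for every new machine.
function registerServerRoutes(app, servers) {
  servers.forEach(server => {
    app.get(`/servers/${server.name}`, (req, res) => {
      // A real handler would render the dashboard template with this
      // server's data; here it just echoes which server was requested.
      res.send(`dashboard for ${server.name}`);
    });
  });
}

// Usage with Express:
//   const express = require('express');
//   const app = express();
//   registerServerRoutes(app, [{ name: 'web01' }, { name: 'db01' }]);
```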

Performance tuning

A lot of the data is processed synchronously. I initially developed the application this way to keep track of the execution flow. The downside is that requests aren't run simultaneously, so in practice some parts of the dashboard load more slowly than others. Asynchronous behaviour should be implemented.
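In a Node back-end, the switch from sequential to simultaneous requests typically means firing them all at once and awaiting the combined result. A sketch, where fetchServerData stands in for whatever call currently runs one server at a time:

```javascript
// Load data for every server concurrently. Promise.all starts all
// requests immediately and resolves once each one has finished,
// instead of awaiting them one after another.
async function loadDashboard(servers, fetchServerData) {
  const results = await Promise.all(servers.map(s => fetchServerData(s)));
  return results;
}
```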

Custom PowerShell scheduling modules

The built-in PowerShell task scheduling modules have limitations: scheduled tasks cannot save output, and scheduled jobs can only be run at intervals of one minute or more. For now, I have worked around this by writing a simple while loop combined with the Get-Date cmdlet.
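The workaround itself lives in PowerShell, but the underlying sub-minute polling pattern (loop, compare the current time against the next due time, run the task when it's reached) can be sketched in JavaScript with an injectable clock; the 15-second interval is illustrative:

```javascript
// Build a checker that says whether the scheduled task is due yet.
// `now` is injectable so the logic can be exercised without waiting.
function makeScheduler(intervalMs, now = Date.now) {
  let nextRun = now();
  return function shouldRun() {
    if (now() >= nextRun) {
      nextRun += intervalMs;  // schedule the next tick
      return true;            // caller runs the task now
    }
    return false;
  };
}
```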