MCDaemon: a small project with big potential

I play a lot of games with my friends, and due to my technical expertise I also provide hosting for some of them. As interest shifts from one game to another, I have always needed to switch the server manually.
Two years ago I acquired an Intel NUC due to a need for an x86 server that could provide services 24/7 without burning through my electricity bill. Currently it runs a web server for file sharing and Jenkins CI deployment, acts as a NAS with its attached 3 TB disk, and can dual-boot into a Linux partition for testing the server setup for BZSmod.

But most of all, it still runs game servers when there is a request for it. This year Minecraft has been quite popular among my friends, and with the many modpacks around, one server just hasn't sufficed.
However, 4 GB of RAM and an Intel Atom don't allow two Minecraft servers at once, so it is necessary to switch between them as needed.

My friends and I have debated many times the option of building a system that would let them make these switches without my assistance. Since PHP's exec() didn't work out for our master plan, the idea had been postponed ever since.

Being available for hire has given me enough time to work on a solution that requires both back-end and front-end development, and I estimate I have spent about 60 to 100 hours of work time so far (across a month, not including time spent learning languages and technicalities).

The result is a stable solution we originally named MineCraftDaemon, though MyCustomDaemon would now be a more fitting name.
The daemon serves a web client that allows switching between servers, reading the output of the active server, and sending input commands to it. This is handled over a socket connection, with the HTTP GET and POST requests interpreted manually.
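MCDaemon's source isn't public yet, so the following is only a rough illustration of the idea in Python (chosen for readability; output_text, control_page and handle_post are placeholder names, and the real daemon may be structured very differently):

    import socket

    def output_text():    # placeholder: console output of the active server
        return "<pre>[12:00:00] Server started</pre>"

    def control_page():   # placeholder: the HTML control form
        return "<html><body>control form goes here</body></html>"

    def handle_post(body):  # placeholder: fleshed out in the next sketch
        return "ok"

    def serve(host="0.0.0.0", port=8080):
        srv = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
        srv.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
        srv.bind((host, port))
        srv.listen(5)
        while True:
            conn, _ = srv.accept()
            raw = conn.recv(65536).decode("utf-8", errors="replace")
            # the request line looks like "GET /output HTTP/1.1"
            method, path, _ = raw.split("\r\n", 1)[0].split(" ", 2)
            if method == "POST":
                body = handle_post(raw.split("\r\n\r\n", 1)[-1])
            elif path == "/output":
                body = output_text()
            else:
                body = control_page()
            conn.sendall(("HTTP/1.1 200 OK\r\nContent-Type: text/html\r\n\r\n"
                          + body).encode())
            conn.close()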

Behind the scenes, a random key generator ensures that the POST request comes from a valid state, and the POST variables are checked to be within their limits. The POST request is processed inside a mutex so that only one request is handled at a time and no queueing of requests can occur. Finally, the request is split into variables that are used for shutting down or starting a cmd process with the appropriate arguments.
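Again, this is a plausible reconstruction rather than the daemon's actual code; the field names key and server, the ALLOWED set, and the .bat naming are invented for the sketch, and the non-blocking lock mirrors the "no queueing" behaviour described above:

    import secrets
    import subprocess
    import threading

    lock = threading.Lock()           # the mutex guarding server switches
    form_key = secrets.token_hex(16)  # random key baked into the control page
    active = None                     # the currently running server process
    ALLOWED = {"vanilla", "ftb", "tekkit"}  # hypothetical server names

    def handle_post(body):
        # body is the urlencoded form data, e.g. "key=...&server=ftb"
        fields = dict(p.split("=", 1) for p in body.split("&") if "=" in p)
        if fields.get("key") != form_key:
            return "invalid key"      # request not from a valid state
        if fields.get("server") not in ALLOWED:
            return "unknown server"   # POST variable outside its limits
        if not lock.acquire(blocking=False):
            return "busy"             # reject instead of queueing the request
        try:
            global active
            if active is not None:
                active.terminate()    # shut down the running server
                active.wait()
            # start the new server through a cmd process
            active = subprocess.Popen(["cmd", "/c", fields["server"] + ".bat"])
            return "switched to " + fields["server"]
        finally:
            lock.release()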

A GET request returns the string representing the output text if /output was included in the path; otherwise it returns the control page. Both are in HTML format.

The good thing about using cmd.exe to call the process we need to start is that it scales well and allows a bat file to do everything related to the setup of the server.
The bad thing is that unless "exit" is placed at the end of the bat script, and the program (e.g. Minecraft) can end itself via a text command (stop or quit), the web application exposes full use of cmd.exe, which could lead to abuse or accidents.

Currently the project is stable and could be launched with a warning that every server should be wrapped in a bat file with an exit command at the end; this may remain necessary as long as cmd.exe is used to execute the programs.
An alternative would be to build that wrapper into the code itself and hope it doesn't introduce any limitations.
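A sketch of what such a built-in wrapper could look like, assuming the server accepts a textual stop command on stdin (the command name, the bat file, and the timeout are all assumptions):

    import subprocess

    def stop_server(proc, stop_command="stop", timeout=30):
        # ask the game server to shut down cleanly via its own console
        # command, then make sure the process actually exits
        try:
            proc.stdin.write((stop_command + "\n").encode())
            proc.stdin.flush()
            proc.wait(timeout=timeout)  # give it time to save and quit
        except (subprocess.TimeoutExpired, OSError):
            proc.kill()                 # fall back to a hard kill
            proc.wait()

    # usage, with stdin piped so we can talk to the server console:
    # proc = subprocess.Popen(["cmd", "/c", "start_server.bat"],
    #                         stdin=subprocess.PIPE)
    # stop_server(proc)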

While this might work well for the specific task it performs, it does not scale very well.

  • What if one wants to provide a printer service through the interface? Should the interface handle login and sessions, or should the printer service?
    • How do we handle the upload of files?
    • How do we avoid abuse?
  • What if we want to run 2 servers, but not 3?
  • How can we provide hidden services for users with privileges?
    e.g. could we expose cmd.exe to the server admin when logged into his/her account, without showing it to anyone else? How would that conflict with the server-count rule?

MCDaemon will be released as open source on GitHub once the security has been improved and a GUI for manipulating the config file is in place, but resolving the scalability issues could make MCDaemon much more useful.
Another feature I intend to implement is Server-Sent Events, to get rid of the refresh cycle and make the feedback from the service smoother. Server-Sent Events would also give a user time to write a longer input command for a service without resorting to copy/paste.
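Server-Sent Events would fit the hand-rolled HTTP approach nicely, since an event stream is just a long-lived response with Content-Type: text/event-stream whose messages are "data:" lines. A minimal sketch, where read_new_lines is a hypothetical callable returning fresh console lines from the active server:

    import time

    def stream_output(conn, read_new_lines):
        # one long-lived response instead of repeated page refreshes
        conn.sendall(b"HTTP/1.1 200 OK\r\n"
                     b"Content-Type: text/event-stream\r\n"
                     b"Cache-Control: no-cache\r\n\r\n")
        while True:
            for line in read_new_lines():
                # each SSE message is "data: <text>" plus a blank line
                conn.sendall(("data: " + line + "\n\n").encode())
            time.sleep(1)

On the client, an EventSource pointed at the stream would then replace the refresh cycle, leaving the input field untouched while the output flows in.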

MCDaemon will be added to my project page when available for download.

Under reconstruction

A former backup of this site has been put in place. Missing data will be added continuously to restore the site.

Linux vs. Windows

I'm a Windows user, and will remain one...

I usually get annoyed when a professor (in the role of a supervisor or lecturer) comes around and tells me to use Linux, especially when the work can be done on Windows.
This semester required more focus on Linux than I would usually care for. In the end I am glad I got some experience with Linux, especially as I acquired a pair of free VPSes running CentOS.

Unfortunately, though Linux is very powerful, since you can "fix"/customize installations and various other scripts, Linux's biggest flaw is probably its lack of standards. I usually prefer to work with a bundle for Windows where the needed subtools fit the requirements of the primary tool (compiler, hosting, CPU-lessons, etc.).

Last week, I completed a Lua script that shadows the HTTP GET and PUT requests to the Homeport system we are working with this semester. I was able to use LuaSocket for Lua 5.1.4 on Windows and still compile against Lua 5.2 - but this was not the case on Linux, which caused a lot of trouble verifying the functionality of the script.
I found a beta build of LuaSocket for 5.2 in a discussion thread, but I would have expected it to be on the official LuaSocket page.
To make everything worse, it of course didn't like my install of Lua. There, several days of dependency fixing came to an end, as it would have required yet more dependencies that we don't have time for.

Where Windows stores thousands of libraries globally in system32, we usually have a bigger problem with duplicated locally stored libraries and countless C++ redistributable packages, because an older version of some library is required whenever backwards compatibility has been omitted, and because developers don't trust other binaries on Windows.
The first case is simply stupid, because a library should always stay as backward compatible as possible; a new library should be created when old functions need to be dropped due to outdated functionality.
The second case is also stupid, since if the first case were avoided, official libraries could then be trusted.

So on Linux you get too little, on Windows too much. (At least you can fix the problems on Linux most of the time... if you have the time.)

At times there are compatibility issues between Windows and Linux. That is OK, but the worst example I have seen so far is Apache's tool for building modules for their HTTP server: they do provide an apxs tool for Windows, but you first have to get it running with Perl (not the Perl in Cygwin, though; you need ActivePerl instead), and that is after you fix the Perl script and remove the Win32 requirement if you are on Win64.
You will then (if you have Cygwin installed) end up with a Windows batch file that uses the wrong paths to GCC and the wrong parameters (after you change all the / to - so it can actually read the parameters).

And in the end, all you needed was to build a regular .so file through GCC, which wasn't included in the installation.

I do see the purpose of all the power in Linux, but that "freedom" prevents standards, which in turn prevents developers from creating user-friendly installations. Though installers try to cover the most common locations (is it /usr/lib or /usr/local/lib?), they will not cover them all, and some might not even check.
This requirement is trivial, but we can't be sure. The solution: let the user fix the installation if we are wrong. However, when we apply that solution to Windows, we end up, in the best case, with the Apache situation above.

This compatibility could become an issue for both Linux and Windows developers. In BZS we will create a server library so Wine isn't needed to host a server for the mod, and I clearly aim to provide Linux compatibility for any future project where Linux has a purpose.
In this case it is easy, but what if it were a package? How do you create an installer that can find the Half-Life Dedicated Server and check for the required dependencies? It's not that simple when you need third-party content, which isn't declared on Linux and may also live in a custom folder.

That's probably why I tend to avoid using Linux: I don't mind having a file system with 3 partitions and a 3-minute boot time, since we share one registry database that handles all that trouble.

I hope developers will consider improving this compatibility towards both platforms, as we are in need of each other's applications and functionality.

Smart Campus - WiFi position service

This semester I will be working with indoor positioning, or technically "Collaborative Localization". I am very happy to be part of this project, as Smart Campus has been quite successful and our contribution to the current research will be very interesting.

We will not be continuing the actual product, but creating a new client for Android phones whose position data from the Smart Campus database will be improved by position estimates from other clients. The current idea is to use Bluetooth for communicating with clients in proximity, thus creating a P2P network that can share estimated positions and improve the estimate for most, if not all, of the clients.

These estimates can be improved further by stationary Bluetooth stations, which have the same communication features but carry a higher priority in position estimation, since they never move and therefore provide reliable data for the database.
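How that priority would be applied is still open; one plausible reading is a confidence-weighted average of all estimates received over the P2P network. A sketch in Python, with made-up weights and a local coordinate frame assumed:

    from dataclasses import dataclass

    @dataclass
    class Estimate:
        x: float      # position in the building's local frame (metres)
        y: float
        weight: float # confidence, e.g. 1.0 for a phone's own WiFi fix,
                      # 3.0 for a stationary Bluetooth station

    def combine(estimates):
        # weighted average: stationary stations pull harder because
        # they never move and are therefore more trustworthy
        total = sum(e.weight for e in estimates)
        x = sum(e.x * e.weight for e in estimates) / total
        y = sum(e.y * e.weight for e in estimates) / total
        return x, y

    # own WiFi fix, a peer's estimate, and a fixed station
    print(combine([Estimate(10.2, 4.1, 1.0),
                   Estimate(11.0, 4.5, 1.0),
                   Estimate(10.5, 4.0, 3.0)]))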


The above will be our primary interest, as most of the group is enrolled in the "Distributed Systems and Network" class, but there are alternatives we dare to use from "Machine Intelligence" (not to be confused with Artificial Intelligence, even though they have a lot in common).
We could use a sort of collaborative filtering to remove bad data and to stabilize/improve the position by comparing it against a set of old data.

Example:"You are walking down a hall which has a lot of open space. The WiFi strength can be unreliable due to the open space, but your signals will change depending on the access-points positions."

As the measurements change slightly, perhaps reinforced by the gyroscope data found in most smartphones, it is possible to look at the last 5 sets of measurements and estimates to provide a more reliable position, depending on the user's movement.
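In its simplest form that is a sliding-window smoother. A minimal sketch, assuming plain averaging over the 5-sample window mentioned above (a real implementation would likely weight by recency and by how fast the user is moving):

    from collections import deque

    class PositionSmoother:
        # keeps the last N estimates and returns their average,
        # damping the jitter from unreliable WiFi readings
        def __init__(self, window=5):
            self.history = deque(maxlen=window)

        def update(self, x, y):
            self.history.append((x, y))
            n = len(self.history)
            return (sum(p[0] for p in self.history) / n,
                    sum(p[1] for p in self.history) / n)

    smoother = PositionSmoother()
    for raw in [(10.0, 4.0), (10.6, 3.8), (9.9, 4.3), (10.2, 4.1)]:
        print(smoother.update(*raw))  # jitter shrinks as the window fills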


A secondary interest, with DSN as the target, would be to use the database as a cloud, both for measurements in large areas such as an airport and for distributing the map data dynamically, so the client won't need all the map data in order to obtain its position.

In either case, we only expect to proceed with the secondary interests if we have time for them.


All these ideas can be combined to give the user a great experience; furthermore, it would be interesting to add a navigation feature and Points of Interest to the data, but that is a major MI part that we won't be working on.

This project will provide experience with:

  • Android applications
  • Bluetooth communication
  • P2P networking
  • Cloud calculations and networking

DRM-Free, Thanks!

Developers and publishers should turn their heads towards something other than DRM.

Recently a lot of people have complained about Ubisoft's DRM, and Ubisoft has therefore turned one of their Ghost Recon projects into a free-to-play game. This could be a good choice for Ubisoft, as they can both avoid piracy and still earn money from in-game stores. It might, however, be frustrating for people who want the whole package and a fair multiplayer experience. But it seems to be good marketing for most companies who try to overcome piracy with the customer's best interest in mind.

However, we don't see many AAA games from Ubisoft without DRM, and it can be very frustrating when you run into problems ranging from server downtime to issues after changing hardware.

Ubisoft is a good example of why we need to find an alternative to DRM. If they lose 95% of their sales to piracy, and add DRM which pirates override anyway, they are just wasting their time making the game experience worse for the remaining 5% (and no, Ubisoft, your games ain't THAT good).

Focusing on smaller games seems to be a good solution, both web-based games like (part of Ubisoft) and games like Minecraft, which has sold 4,719,837 keys; it was $5 at the start but has changed to €20 because of its popularity. The point is that smaller games can be "good enough" to challenge bigger games, and in this case probably generate a larger income than the big guys.

Perhaps if Ubisoft lowered their budgets a little, they could make games that aren't so expensive to "lose" to piracy, and achieve both income and DRM-free glory at the same time.

At least in my opinion:

Minecraft is making a good example of how to do it right.