The domain-independent, universal Normalized Information Distance, based on Kolmogorov complexity, has been successfully applied, in approximate form, to a variety of difficult clustering problems. In this paper we investigate theoretical properties of the un-normalized algorithmic information distance d_K. A simple modification makes it an exact metric, without the usual additive slack. The main results are that d_K is not a Euclidean distance, but is Euclidean on some arbitrarily large finite subsets of {0,1}*. While these results are modest, the main contribution of the work so far is to provide the framework and tools needed to establish further properties of d_K in future work, and to state some open problems.
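
For concreteness, the sketch below illustrates the approximate form alluded to above: the normalized compression distance (NCD), in which the uncomputable Kolmogorov complexity K is replaced by the output length of a real compressor. This is the standard practical stand-in for the NID (and d_K is commonly taken to be max{K(x|y), K(y|x)}); the compressor choice (Python's zlib) and the helper names c and ncd are illustrative, not taken from this paper.

    # Sketch, not the paper's construction: NCD approximates NID by using
    # compressed length in place of Kolmogorov complexity K.
    import zlib

    def c(data: bytes) -> int:
        """Compressed length in bytes, standing in for K."""
        return len(zlib.compress(data, level=9))

    def ncd(x: bytes, y: bytes) -> float:
        """NCD(x, y) = (C(xy) - min(C(x), C(y))) / max(C(x), C(y))."""
        cx, cy, cxy = c(x), c(y), c(x + y)
        return (cxy - min(cx, cy)) / max(cx, cy)

    # Strings sharing structure should come out closer than unrelated ones.
    print(ncd(b"abracadabra" * 20, b"abracadabra" * 19 + b"zzz"))
    print(ncd(b"abracadabra" * 20, bytes(range(256)) * 2))

Values near 0 indicate high shared structure and values near 1 indicate near-independence, which is what makes the approximate distance usable for clustering.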