A new approach is presented to describe how the statistics of the log-return distribution of financial data change as a function of the timescale. To this end a measure is introduced that quantifies the distance between a given distribution and a reference distribution. The existence of a small-timescale regime is demonstrated, which exhibits properties different from those of the normal-timescale regime. This regime appears to be universal across individual stocks. It is shown that the existence of this small-timescale regime does not depend on the particular choice of the distance measure or the reference distribution. These findings have important implications for risk analysis, in particular for the probability of extreme events.
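For illustration, the following is a minimal sketch of how such a timescale-resolved distance could be computed. Both the Kullback-Leibler divergence and the Gaussian reference used here are illustrative choices, not necessarily those of the paper, and the synthetic heavy-tailed price series stands in for real market data.

```python
import numpy as np

def log_returns(prices, tau):
    """Log returns of a price series at timescale tau (in samples)."""
    prices = np.asarray(prices, dtype=float)
    return np.log(prices[tau:]) - np.log(prices[:-tau])

def kl_distance(samples, ref_pdf, bins=100):
    """Kullback-Leibler divergence D(p || p_ref) from a histogram estimate.

    Illustrative distance measure; the paper's measure may differ.
    """
    hist, edges = np.histogram(samples, bins=bins, density=True)
    centers = 0.5 * (edges[:-1] + edges[1:])
    width = edges[1] - edges[0]
    q = ref_pdf(centers)
    mask = (hist > 0) & (q > 0)  # avoid log(0) in empty or unsupported bins
    return np.sum(hist[mask] * np.log(hist[mask] / q[mask])) * width

def gaussian_reference(samples):
    """Gaussian reference matched to the sample mean and std (illustrative)."""
    mu, sigma = np.mean(samples), np.std(samples)
    return lambda x: (np.exp(-0.5 * ((x - mu) / sigma) ** 2)
                      / (sigma * np.sqrt(2.0 * np.pi)))

# Distance as a function of timescale for a synthetic heavy-tailed series
rng = np.random.default_rng(0)
prices = np.exp(np.cumsum(rng.standard_t(df=3, size=100_000) * 1e-3))
for tau in (1, 4, 16, 64, 256):
    r = log_returns(prices, tau)
    print(f"tau={tau:4d}  distance={kl_distance(r, gaussian_reference(r)):.4f}")
```

Under aggregation, heavy-tailed returns at small timescales move toward Gaussian statistics at large timescales, so such a distance typically decays with increasing tau; in this framework, a distinct small-timescale regime would manifest itself as a qualitative change in that decay.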