In many graphics applications we need to know the distance between two points or pixels, for example when performing collision detection in games. Here we are only interested in calculating the distance between pixels in 2D. Extending this to 3D is not much more complicated, and we will cover that at a later date. For now, back to our 2D example.
We can determine the distance from one known pixel coordinate to another using some simple geometry. In the figure below we would like to calculate the distance from point a to point b. These could be any two pixels on the screen, with coordinates (x_{1},y_{1}) and (x_{2},y_{2}) respectively.

The distance from a to b is calculated using Pythagoras’ theorem, as long as the coordinates of the two points are known. First we need to know the vertical difference between the points (y_{2}-y_{1}) and the horizontal difference (x_{2}-x_{1}), as shown below.

Pythagoras’ theorem states that the distance between a and b is given by d^{2} = (y_{2}-y_{1})^{2}+(x_{2}-x_{1})^{2}, where d is the distance between points a and b. As an example, if we have two points with coordinates as given in the figure below then we can find the distance between them as follows.
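The formula translates directly into code. Here is a minimal Python sketch; the function name `pixel_distance` is our own choice, not from any particular library.

```python
import math

def pixel_distance(x1, y1, x2, y2):
    """Distance between two pixels, using Pythagoras' theorem."""
    dx = x2 - x1  # horizontal difference
    dy = y2 - y1  # vertical difference
    return math.sqrt(dx * dx + dy * dy)

# Distance between (44, 18) and (184, 76), the points used below.
d = pixel_distance(44, 18, 184, 76)
```

In practice Python's standard library also offers `math.hypot(dx, dy)`, which computes the same value.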

d^{2} = (76-18)^{2}+(184-44)^{2}
d^{2} = 58^{2}+140^{2}
d^{2} = 3364+19600 = 22964
d = √(22964) ≈ 151.54
So the approximate distance between the two points is 151.54, or 152 rounded to the nearest pixel! This is a simple example, but all the more advanced topics in graphics theory are built on simple calculations such as this.
NOTE: The above formula works fine even if the x, y coordinates of point a are greater than those of point b. In this case you will get negative values when you calculate y_{2}-y_{1} and x_{2}-x_{1}. This does not matter, as the negative sign disappears when the value is squared.
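We can verify this symmetry directly: swapping the two points makes both differences negative, yet the distance is unchanged. A short Python sketch (the `pixel_distance` helper is our own illustrative name):

```python
import math

def pixel_distance(x1, y1, x2, y2):
    """Distance between two pixels, using Pythagoras' theorem."""
    dx = x2 - x1
    dy = y2 - y1
    return math.sqrt(dx * dx + dy * dy)

# Measuring from a to b, and from b to a: the differences change
# sign, but squaring removes the sign, so the distances are equal.
d_ab = pixel_distance(44, 18, 184, 76)
d_ba = pixel_distance(184, 76, 44, 18)
```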