In many graphics applications it is necessary to know the distance between two points or pixels — for example, in collision detection in games. Here we are interested only in calculating the distance between pixels in 2D. Extending this to 3D is not much more complicated, and we will cover it at a later date. Anyway, back to our 2D example.
We can determine the distance from one known pixel co-ordinate to another using some simple geometry. In the figure below we would like to calculate the distance from point a to point b. These could be any two pixels on the screen, with co-ordinates (x1,y1) and (x2,y2) respectively.
The distance from a to b is calculated using Pythagoras’ theorem, as long as the co-ordinates of the two points are known. First we need to know the vertical difference between the points (y2-y1) and the horizontal difference (x2-x1), as shown below.
Pythagoras’ theorem states that the distance between a and b is given by d² = (y2-y1)² + (x2-x1)², where d is the distance between points a and b. As an example, if we have two points with co-ordinates as given in the figure below then we can find the distance between them as follows.
d² = (76-18)² + (184-44)²
d² = 58² + 140²
d² = 3364 + 19600 = 22964
d = √22964 ≈ 151.54
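The article does not tie the calculation to any particular language, but the steps above translate directly into code. Here is a minimal Python sketch; the function name pixel_distance is our own choice, not something defined in the text.

```python
import math

def pixel_distance(x1, y1, x2, y2):
    """Distance between two pixel co-ordinates via Pythagoras' theorem."""
    dx = x2 - x1  # horizontal difference
    dy = y2 - y1  # vertical difference
    return math.sqrt(dx * dx + dy * dy)

# The worked example from the text: a = (44, 18), b = (184, 76)
d = pixel_distance(44, 18, 184, 76)
print(round(d, 2))  # 151.54
```

Python's standard library also offers math.hypot(dx, dy), which computes the same result and avoids overflow for very large co-ordinates.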
So the approximate distance between the two points is 151.54, or 152 rounded to the nearest pixel! This is a simple example, but all the more advanced topics in graphics theory are built on simple calculations such as this.
NOTE: The above formula works fine even if the x, y co-ordinates of point a are greater than those of point b. In that case you will get negative values when you calculate y2-y1 and x2-x1. This does not matter, as the negative sign disappears when the value is squared.
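The note above can be checked directly: measuring from b back to a gives negative differences, yet squaring removes the sign, so the distance is unchanged. A small Python demonstration, using the same two points as before:

```python
import math

# Measuring from b = (184, 76) back to a = (44, 18):
dx = 44 - 184   # -140, a negative horizontal difference
dy = 18 - 76    # -58, a negative vertical difference

# Squaring discards the sign, so the result matches the a-to-b distance.
d = math.sqrt(dx * dx + dy * dy)
print(round(d, 2))  # 151.54
```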