Comp 2673, Spring 2002
May 20, lecture notes

Today we'll complete the discussion of how to find the distance from a point to an object.

To find the perpendicular distance from a point (x_m, y_m) to a line Ax + By + C = 0, use the formula

    distance = |A*x_m + B*y_m + C| / sqrt(A^2 + B^2).

To find the distance from a point to a line segment, first figure out whether the point lies inside the strip of the plane spanned by the segment (the region between the two lines through the endpoints perpendicular to the segment). Say the line segment has length a, and the distances from the endpoints to the point (x_m, y_m) are b and c. If a^2 + b^2 > c^2 and a^2 + c^2 > b^2, then both angles at the endpoints are acute (by the law of cosines, the angle opposite side c is acute exactly when a^2 + b^2 > c^2), the perpendicular from the point meets the line within the segment, and you can just use the perpendicular distance. If either of those two inequalities fails, then the point lies outside the strip, the nearest point of the segment is one of its endpoints, and the distance from the point to the segment is the minimum of b and c.
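The two computations above can be sketched in a few lines of Python. This is just an illustration of the formulas from the notes, not code from the course; the function names and the coordinate-pair parameters are my own choices.

```python
import math

def point_line_distance(xm, ym, A, B, C):
    # Perpendicular distance from (xm, ym) to the line Ax + By + C = 0:
    # |A*xm + B*ym + C| / sqrt(A^2 + B^2).
    return abs(A * xm + B * ym + C) / math.hypot(A, B)

def point_segment_distance(xm, ym, x1, y1, x2, y2):
    # a: length of the segment; b, c: distances from its endpoints to the point.
    a = math.hypot(x2 - x1, y2 - y1)
    b = math.hypot(xm - x1, ym - y1)
    c = math.hypot(xm - x2, ym - y2)
    # Both endpoint angles acute: the foot of the perpendicular lies on
    # the segment, so the perpendicular distance is the answer.
    if a * a + b * b > c * c and a * a + c * c > b * b:
        # Write the line through (x1, y1) and (x2, y2) as Ax + By + C = 0.
        A = y2 - y1
        B = x1 - x2
        C = -(A * x1 + B * y1)
        return point_line_distance(xm, ym, A, B, C)
    # Otherwise the nearest point of the segment is an endpoint.
    return min(b, c)
```

For example, the point (0, 1) is perpendicular distance 1 from the segment from (-1, 0) to (1, 0), while the point (2, 0) lies outside the strip and its distance is 1, the distance to the nearer endpoint (1, 0). Note that when both inequalities fail to hold strictly (including the degenerate case a = 0), the code falls through to the endpoint case, so no division by zero occurs.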