Throughout history, angles have been measured in many different units. Common units you will encounter in your courses are degrees (°), radians (rad), and the degree subdivision known as degrees-minutes-seconds notation (D°M'S'').
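In degrees-minutes-seconds notation, each degree is divided into 60 minutes and each minute into 60 seconds. As a sketch, the conversion to decimal degrees can be written as follows (the function name `dms_to_degrees` is an illustrative choice, not a standard API):

```python
def dms_to_degrees(d, m, s):
    """Convert degrees-minutes-seconds to decimal degrees.

    One degree = 60 minutes; one minute = 60 seconds,
    so D°M'S'' = D + M/60 + S/3600 degrees.
    """
    sign = -1 if d < 0 else 1
    return sign * (abs(d) + m / 60 + s / 3600)

print(dms_to_degrees(30, 15, 0))  # 30.25
```

For example, 30°15'0'' equals 30 + 15/60 = 30.25 decimal degrees.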
The original motivation for choosing the degree as a unit of rotations and angles is unknown. One theory states that it is related to the fact that 360 is approximately the number of days in a year. Ancient astronomers noticed that the sun, which traverses the ecliptic over the course of the year, seems to advance along its path by approximately one degree each day. Some ancient calendars, such as the Persian calendar, used 360 days for a year. The use of a calendar with 360 days may be related to the use of sexagesimal numbers.
Another theory is that the Babylonians subdivided the circle using the angle of an equilateral triangle as the basic unit and further subdivided the latter into 60 parts following their sexagesimal numeric system. The earliest trigonometry, used by the Babylonian astronomers and their Greek successors, was based on chords of a circle. A chord of length equal to the radius made a natural base quantity. One sixtieth of this, using their standard sexagesimal divisions, was a degree.
The radian describes the plane angle subtended by a circular arc as the length of the arc divided by the radius of the arc. One radian is the angle subtended at the center of a circle by an arc that is equal in length to the radius of the circle. More generally, the magnitude in radians of such a subtended angle is equal to the ratio of the arc length to the radius of the circle; that is, θ = s / r, where θ is the subtended angle in radians, s is the arc length, and r is the radius. Conversely, the length of the enclosed arc is equal to the radius multiplied by the magnitude of the angle in radians; that is, s = rθ.
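The two relations θ = s/r and s = rθ can be sketched directly in code (the function names here are illustrative, not from any particular library):

```python
import math

def subtended_angle(arc_length, radius):
    """theta = s / r: angle in radians subtended by an arc."""
    return arc_length / radius

def arc_length(radius, theta_rad):
    """s = r * theta: arc length enclosed by an angle in radians."""
    return radius * theta_rad

# An arc equal in length to the radius subtends exactly one radian.
print(subtended_angle(5.0, 5.0))   # 1.0

# A full circle has circumference 2*pi*r, so it subtends 2*pi radians.
print(subtended_angle(2 * math.pi * 3.0, 3.0))  # 6.283185...
```

Note that the formulas only hold when θ is expressed in radians; an angle in degrees must first be converted (e.g. via `math.radians`).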