No, the southern colonies did not have the first public schools. The first public schools in the American colonies appeared in New England, notably Massachusetts and Connecticut, in the 17th century; Massachusetts's school laws of 1642 and 1647 required towns to provide for instruction. These schools were set up primarily to teach children reading and writing.