List of nursing schools in the United States

This is a list of nursing schools in the United States of America, sorted by state.

A nursing school is a school that teaches people how to be nurses (medical professionals who care for individuals, families, or communities in order to attain or maintain health and quality of life). Under each state, schools are grouped by program level: bachelor's degree programs or higher, and associate degree programs.