Women's fiction is an
umbrella term for women-centered books that focus on women's life experiences and are marketed to female readers; it includes many
mainstream novels. It is distinct from
women's writing, which refers to literature written by (rather than marketed to) women. There is no comparable label in English for works of fiction marketed to men.