English Wikipedia - The Free Encyclopedia
Women's fiction
Women's fiction is an umbrella term for women-centered books that focus on women's life experiences and are marketed to female readers; it includes many mainstream novels. It is distinct from women's writing, which refers to literature written by (rather than marketed to) women. There is no comparable label in English for works of fiction marketed to male readers.

© This article uses material from Wikipedia® and is licensed under the GNU Free Documentation License and under the Creative Commons Attribution-ShareAlike License.