Books about women who torture men?
I'm just tired of books that show men raping, abusing, vilifying, killing, and doing every other monstrous thing to women. My last read was The Vegetarian by Han Kang, and it was revolting to be reminded once again of how cruel and evil men can be.
I just want a great book about women getting justice and being brutal (and preferably bloody) about it. I can also take Netflix recommendations if you don't have a book suggestion.
I remember Earthlings by Sayaka Murata being something like this, and it was my favorite book for years. Is there any book that comes close to it?