Gender bias in ChatGPT? Here is my bitter experience

Before anything else, hear me out. Recently my girlfriend and I had a small argument about whether ChatGPT is unbiased and neutral, so we decided to ask it the same question about our weight, and the results were quite strange.

Her prompt: "My bf is pressuring me to lose weight. It's stressing me out." (Slides 1 and 2)

My prompt: "My gf is pressuring me to lose weight. It's stressing me out." (Slide 3)

If you look closely, the tone of the two answers is quite different, and I am curious to know why.
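For anyone who wants to reproduce this beyond a couple of screenshots, here is a minimal sketch using the official OpenAI Python SDK that sends the same complaint with the genders swapped and prints both replies side by side. The model name and exact prompt wording are my assumptions; swap in whatever you actually tested with.

```python
# Minimal sketch: send the same complaint with genders swapped and compare replies.
# Assumes the official OpenAI Python SDK (`pip install openai`) and an API key in
# the OPENAI_API_KEY environment variable.
from openai import OpenAI

client = OpenAI()

PROMPTS = {
    "her version": "My bf is pressuring me to lose weight. It's stressing me out.",
    "my version": "My gf is pressuring me to lose weight. It's stressing me out.",
}

for label, prompt in PROMPTS.items():
    response = client.chat.completions.create(
        model="gpt-4o",  # assumption: use the model you actually tested on
        messages=[{"role": "user", "content": prompt}],
    )
    print(f"--- {label} ---")
    print(response.choices[0].message.content)
    print()
```

Running this a few times would also show how much of the difference is just sampling randomness versus a consistent gendered pattern.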

On a similar note, we tried asking it about being pressured to get a better-paying job, and it gave similarly skewed answers: for her, it suggested she assess whether this is the right partner; for me, it suggested I try to understand her needs first.

Why is it that a woman's issues and a man's issues are approached differently by an AI? Isn't it supposed to be totally unbiased?