Why do people refer to Thailand as a place to meet women who work in bars and sell themselves?
It makes me sad to see such an amazing country use this angle.
I've met so many nice Thai women who are honest and decent. I'm not blaming anyone.
What is the cultural difference?
Selling yourself is frowned upon in my country (the UK). Why does Thailand have this stigma?
I don't understand the logic behind it. Yes, Thai people are decent people, and they work. I realise that there is no social support system there.
Where does this culture come from? I do apologise if this comes across as provocative. I'm just curious. But then again, I don't have to contemplate this.