Hi all, I was just wondering if you think the above is true. I once wanted to work in Thailand but have been put off by some of the reports I've heard recently about the people becoming more violent, less tolerant of others, and about the amount of crime. Not to mention the pollution and a host of other things. Maybe I'm being naive and those who work there know better, but it would be interesting to hear from people who have worked there for a long time. It doesn't strike me as appealing as it used to be. Any thoughts?