I find my views on "relationships" are increasingly pessimistic. Anyone else like this? It seems women are in direct opposition to me all the time, and that their nature is just to f*ck with me and make my life a living hell, to lie and cheat and deceive. Ever since the feminist movements, women also seem to take delight in making a man's life miserable, and now they think they can do whatever they want without consequences because they feel justified in everything they do. A lot of them also seem to be emotionally damaged. And it's not just women: society in general seems to be turning its back on the ideal relationship, taking a step back from progress. Maybe I am generalizing, or there just seem to be a lot of people in the situation I am describing. People no longer seem to value a stable relationship, and women and men seem far apart these days, only relating to each other when it has to do with "f*cking".
Maybe I am just severely isolated.
It's not just women, either.. sorry, I had to edit this post for that. It's both genders in general, and the dying idea of a harmonious relationship between the two. The dominant idea now seems to be a relationship in which one takes advantage of the other, with no regard for the other person or the consequences. And a lot of people seem to believe this is the way, even the best way?
I'm not going to revise this post much right now, as I've got things to do, and maybe this post is really worth nothing. But I just want to let you all know that I've lost faith in relationships of all sorts, and in people in general. Everyone seems to be in direct opposition to me, even when they are trying to be nice and seem innocent, as if nothing is going on. On the inside it may be a completely different story.
Maybe it's just me... but every single girl/woman/female I have come across has been cold-hearted, insensitive, mean, lying, and deceptive. Heartbreakers, all of them. Maybe they get better with age, but how much do they have to age before they actually start acting like an ideal "adult" should, while everyone else denies it and gives them every benefit of the doubt? Are women ever wrong? Do they always have to be right in everything they do, while men are treated like doormats, no matter who they are? Are men just forced to forge themselves into something else so they can deal with this reality? "Just deal with it," huh?