First of all, I put this in here because I wanted it to be both semi-serious and lighthearted at the same time. Second, and most importantly, I have absolutely nothing against gay people whatsoever.
With that said, it seems to me that there is more and more focus on all things gay in media and society lately. There's an ongoing feud between American Idol and the new NBC show The Voice, with claims that AI is homophobic because it doesn't play up the gay angle, while The Voice in its first two episodes has featured two openly gay men and two openly lesbian women. Does it really matter what someone's sexual orientation is when they're trying to win a singing competition?
Then there are TV shows where some seem to go out of their way to promote gay characters or story lines. For instance, I love the show Glee, but as the show has progressed, it has been more 'in your face' with gay issues than ever before. At the rate they keep making characters gay, there will soon be more gay than straight characters on the show.
And here in the US, there is a growing push from the PC crowd and the gay community to have subjects about it taught in schools, even to grade school kids.
Lastly, and probably the thing that bugs me the most: why does the gay community need it all to be 'in your face' to begin with? They act like if you don't mention that someone is gay, then you are hiding it or you are homophobic. Have you ever seen a show where a person or character has to come out and openly declare just how heterosexual they are?
Maybe it's just me, I dunno, but it sure seems like it's more and more prevalent every day. Like I said, I have nothing against gay people, and they are free to do whatever they want, just like anyone else. But why the need to flaunt it or promote it in such an 'over the top' way?