I think it's a misunderstood country, to tell you the truth.
People in my area seem to think that:
A) it has turned into a revisionist country that promotes the idea that the Holocaust was a complete fabrication on the Allied countries' part.
B) Even with this supposed revisionist thinking, the Holocaust is not taught in schools in Germany, and thus the younger generations are completely uneducated when it comes to this horrific event.
C) Everyone in Germany is a Nazi.
I never really bought any of this. It was hard to stay open-minded during my K-12 years, because the Holocaust was always taught as a terrible event started by the evil Germans. I always felt there was more to the story than that Adolf Hitler simply woke up one day in the '30s and decided he didn't like Jews.
So, after taking some college courses focused solely on Germany, I feel like I understand the country a lot better. The professor who taught them is well respected in historical circles and spent a lot of time over there, both with the military and on his own. Thanks to him, I have a more well-rounded view of Germany: a country that has certainly endured hardship, but has managed to put one foot in front of the other and keep going.
And the people there today are not evil or Nazis or anything like that, at least not on any large scale. They're nice people, some perhaps a bit distrustful of Americans, but overall it's a great country with a lot going for it. Travelling there is wonderful, and the country, as mentioned, has given us such awesome things as Oktoberfest.
I don't mean to downplay the Holocaust at all, so I hope I haven't offended anyone. It was a horrific event. I've seen photographs, and watched documentaries, so yes, it was horrible and disgusting and tragic and should, IMO, always be taught and not tiptoed around.
I think my issue lies with the public education system in the States, again in this area specifically, for presenting such a one-sided story. I know that traditionally those who win wars get to write the history, but sometimes that history becomes too one-sided. The issues brewing in Germany before the war began need to be addressed as well, because only then will people have a genuinely well-rounded view. Students here should have been taught not just that the war happened, but WHY it happened.
I'm really sorry, here I am a newbie and just totally jumped on my soapbox and started ranting.
I apologize again and want to reiterate it's not my intention to offend anyone, so please don't take this to heart.
In case anyone hadn't already guessed, I was a history major. I've since reduced it to a minor with a focus on World War II in Europe, but now I'm thinking of finishing my English degree and, while I'm waiting to start grad school, completing the history major as well.