It's not that we're waxing nostalgic about that time period and thinking everything was a-okay. It's more that it finally feels okay to take off the blindfolds and talk about the reality of the era and what happened to people who were American citizens (at least I think they were). What is it about that period that we, as a society, want to talk about now?
I am not saying that the 1960s were innocent; they weren't. But they certainly had a huge impact not only on the United States, but also on the industrialized world.
Your thoughts on this are most definitely welcome.