How America Has Changed Since 9/11

Since the attacks on the World Trade Center, America has had to reevaluate its position in an unfamiliar international and political climate. Regardless of one’s political leanings, one thing is certain: America has changed.
