Sunday, June 12, 2016

When Did White Men Become The Bad Guys in America?

John Hawkins  Posted: Jun 11, 2016


All most white men want to do is live their lives to their fullest potential, just like everybody else, without being smeared as devils because of their skin color and gender.
