It’s one of the worst things I’ve ever heard.

Except it just keeps getting worse and worse. I feel so powerless. Americans, in general, want to be the “good guys.” I know I do… but I can’t pretend that things like this aren’t happening. And neither can the rest of the world. It makes me feel so ashamed. I love this country, but I’m sure not proud of it these days. Nor can I see a way out.