Ever wonder why American news has forgotten about Afghanistan?
I never hear about it when I'm in the States, but it's in the Canadian news regularly, and it was constantly in the news when I was in France. Of course, Canadian and French troops are still fighting alongside our troops in Afghanistan, and dying there like everyone else. But you knew that, right? The angry guy who told you that "those damn Frenchies are utterly opposed to the US and constantly support our enemies" also slipped in "Well, except for all of the support they've given in Afghanistan and Lebanon..." right?
Apparently, the situation in Afghanistan is terrible: key cities overrun by Taliban forces, various places turned into 'hellholes', girls forced out of schools, and so on. The good prognosis is that we'll be there for another decade. The bad one is that the war has already been lost. So why isn't anyone talking about it in the States?