Americans like to pretend their problems haven't been soundly addressed, in sensible fashion, by other developed nations for decades... be it hate speech, gun control, or socialized health care. They've been brainwashed by their media... and their government... into believing that the slightest change on any of these fronts will mean the end of the Bill of Rights, the incarceration of innocents, and the end of their religion of choice, and that evil non-white immigrants will descend upon their newly unarmed selves to rape their families and steal their jobs.