I agree with others above that urban west-coast America isn't the America I know.
I've discussed with others in the UK that when British people (sorry to lump you in with them!) visit the US for the first time, their opinion of America generally improves a lot, whereas when Americans visit London (for that is where they generally go), their opinion usually becomes more negative: "What?! You mean not everyone lives in a manor house? Or has servants?" I'm exaggerating slightly, but I have observed the effect many times.