What is Left Coast?
1.
A slightly derogatory term for America's West Coast, used by Republicans to refer to the predominantly Democratic states of California, Oregon, and Washington.
Arnie must feel very alone on the left coast.
2.
Used by those who live on the opposite coast to describe the other side of the map. Otherwise known as the West Coast.
"Hey man, where you from?"
"I'm from Washington."
"D.C.?"
"No, on the Left Coast."