Left Coast

What is Left Coast?


1.

A slightly derogatory term for America's West Coast, used by Republicans to refer to the primarily Democratic states of California, Oregon, and Washington.

Arnie must feel very alone on the left coast.

2.

Used by those who live on the opposite coast to describe the obvious side of the map. Otherwise known as the West Coast.

"Hey man, where you from?"

"I'm from Washington."

"D.C.?"

"No, on the Left Coast."

See map, coast, west coast, east coast

