Left Coast

What is Left Coast?


1.

A slightly derogatory term for America's West Coast, used by Republicans to refer to the predominantly Democratic states of California, Oregon, and Washington.

Arnie must feel very alone on the left coast.

2.

Used by people who live on the opposite coast to describe the other side of the map. Otherwise known as the West Coast.

"Hey man, where you from?"

"I'm from Washington."

"D.C.?"

"No, on the Left Coast."

See map, coast, west coast, east coast
