Left Coast

What is Left Coast?


1.

A slightly derogatory term for America's West Coast, used by Republicans to refer to the predominantly Democratic states of California, Oregon, and Washington.

Arnie must feel very alone on the left coast.

2.

Used by those who live on the opposite coast to describe the left-hand side of the map. Otherwise known as the West Coast.

"Hey man, where you from?"

"I'm from Washington."

"D.C.?"

"No, on the Left Coast."

See map, coast, west coast, east coast

