Definition of western
western - adj. lying toward or situated in the west; relating to or characteristic of regions of the western parts of the world; of or characteristic of regions of the United States west of the Mississippi River; (of wind) coming from the west; lying in or toward the west. noun a film about life in the western United States during the period of exploration and development; a sandwich made from a western omelet.