The West Coast is nicknamed the “best coast” for a reason — well, actually, quite a few. Its natural scenery is unmatched anywhere else in the US. From mountains that loom ...