Hurricanes never seem to hit the west coast of the US. Why is that?


In: Earth Science

12 Answers

Anonymous

When California became a state, they opted for earthquakes and wildfires in place of hurricanes
