Why do companies provide health insurance in the US instead of just increasing wages by an equivalent amount?


Is it because of tax benefits or something similar? If so, couldn't the government provide the same tax benefits to individuals to make health insurance more affordable for everyone?

disclaimer: I am not from the US


15 Answers

Anonymous

Because health insurance companies control much of the production and distribution of medical equipment and services. They also give lots of money to politicians to keep it that way.
