Overview

The majority of working-age adults in the United States have health insurance through their employers. Theory and a growing body of research suggest that when the cost of workers' benefits goes up, these cost increases are borne by workers and their employers. As a result, employer-sponsored insurance creates a link between what happens in…