Do Visitors Need U.S. Health Insurance?

While health insurance is not always mandatory when traveling to the U.S., requirements can vary depending on the circumstances of your visit and the type of visa you need. In addition, health plans from other countries are generally not accepted in the U.S., which means you could pay thousands of dollars out-of-pocket…