When did the men's US soccer team win a World Cup?

The US Men's team has never won a FIFA World Cup. The US Women's team has won four FIFA Women's World Cups (1991, 1999, 2015, 2019).