Will Americans Finally Care About Soccer?


The United States has done surprisingly well at the World Cup so far. It is surprising because the team usually doesn't impress, but also because something essential is missing back home: a country that gives a flying rat's butt about soccer. Oh sure, Americans care now…a little…because the team is doing well, and Americans love nothing more than America being awesome. But let's imagine for a moment a scenario, however unlikely, in which the United States wins the FIFA World Cup. Would it make a difference in a country that is normally ambivalent about the sport, if not downright hostile to it? All around the world, and especially in Latin America, people live and breathe soccer. But in America we see soccer as something kids play until they hit high school and join a real sports team.

It's hard to say whether winning the World Cup would change attitudes in the long run. When the celebrations of our victory ended, would we just go back to ragging on soccer like we always have? Having at least the possibility of winning has certainly gotten Americans tuning in in larger numbers than ever before. I think it will take some time. Perhaps the kids watching the U.S. do well at the World Cup today will grow up to be soccer fans. Maybe their children will one day dream about becoming soccer players instead of basketball or football stars. It will be interesting to see what this World Cup does to Americans' love/hate relationship with soccer.
