From Executive Producers Kevin Costner and Doris Kearns Goodwin, "Kevin Costner's The West" is a series that provides a fresh look at the epic history of the American West by delving into the desperate struggle for the land itself, and how that struggle still shapes the America we know today.