The Marines

    Synopsis

    For longer than the United States has been an independent nation, there has been a Marine Corps. Marines consider themselves the very best America has to offer. Embodying fierce patriotism and extraordinary courage, and equipped with innovative weapons, they are a formidable fighting force. This documentary focuses on their training and examines what it means to be a Marine.
