Hollywood’s Changing View of Immigrants?

“Though the American film industry was founded largely by enterprising immigrants and has been fed by successive streams of talented émigrés, Hollywood has generally preferred to depict an idealized, homogeneous America, where the nonwhite and the nonnative linger in the margins and the shadows.”