Sex used to be the ultimate taboo in American film: movies featuring explicit sex were considered far more socially dangerous than those showing brutal violence and gore. Recently, though, audiences (and the industry) appear to be loosening up about on-screen sex. It would be tempting to think that this is because America is finally getting a bit more grown up about sex, or because a nation at war with itself is ready for some frank hedonism.