"you watch a movie where a woman everyone describes as ‘strong’ demonstrates her strength by refusing to take shit, threatening to tear a man’s tongue out of his head if he speaks to her like that again. but this is roleplay - how can everyone not see this is roleplay? she obviously could not tear his tongue out. if they fought, he would win. in the movie, everyone draws back like she’s being scary, and you see it as everyone indulging her in her pretend feeling of having some sort of power."
As a preteen growing up in mainstream feminist culture, I think I used to experience these scenes in movies the opposite way. I only knew physical conflict through movies like this. When conflict was on the news, it was discussed rather than shown, so you never got to see what gender the soldiers were. And my education emphasised that girls were just as good at everything as boys. In my brain, the notion that women can't fight and thus aren't dangerous sat in the same category as treating disease with leeches: silly superstitious stuff people used to believe in the past. My parents did tell me that boys got stronger on average than girls after puberty, but I'd somehow understood this to be a marginal difference that didn't matter much outside aggregate statistics. And anyway, the martial arts movies said skill and having a weapon mattered more than strength.
So when the women in movies threatened to rip people's tongues out for disrespecting them, I took that just as seriously as I would have taken a male character doing it.
As a result, I often greatly disliked the women characters in these movies. They were supposed to be sympathetic and on the side of good, but they threatened people with physical violence at the slightest provocation! The male characters who were supposed to be sympathetic didn't do that. Or if they did, it was a Big Deal and they got a talking-to about not falling to the dark side. But the women somehow got to do it with no criticism whatsoever.
Looking back, this is obviously because the women's threats weren't really considered serious. The actors might pretend they're serious, but the audience doesn't really believe it, and the writers don't either. Their attitudes leak into the story. But kid me thought it was serious. And so these scenes kind of angered and worried me. Was this a weird genre convention, or did it reflect real-world attitudes? Could women in real life put me in the hospital for 'disrespecting them', like insecure thugs, and just get away with it?
EDIT June 12: On reflection, I no longer endorse this post-hoc story about what I thought when I was younger. The supporting memories are too vague and indirect. There's a vibe of something like this, but little in the way of concrete memories of actual thoughts kid me had about movies to back it up. The evidence doesn't seem to single out the detailed story I wrote above.