Thanks for submitting!
Taught is such a strong word, isn't it? There aren't exactly classes on how to be a feminist. I do understand what you are asking though.
I'd say there are two sides of feminism worth noting: the national movement itself, which is vague and divided, and media-style feminism.
Media-style feminism, so to speak, seems to be fabricated virtue-signaling garbage that most people mistake for actual feminism. "All white men are either jerks, simps, nerds, or weaklings" is a characteristic trope in things like Marvel Phase 4 or the new Star Wars shows, for example. Right-wing media then takes this and runs with it as a strawman. And now it's divisive, even though it's all built on profit margins by appealing to extreme views.
It's extremely unclear what feminists as a national movement want, beyond the general ideas: rape is bad, don't beat your wife, women shouldn't get paid less for the same job, and so on. The nitpicking among those involved in the movement is annoying to me and distracts from the real issues.
Bottom line: I think most feminists aren't being taught very radical ideas. However, popular media, particularly movies and TV, has purposely hijacked the movement in order to receive widespread acclaim, at the cost of radicalizing some members.
It's less about mingling and more about misinformation and following the herd. That said, I'd say even this amount of radicalization is pretty much harmless for now. Only time will tell, though.