Men have always been turned off by feminism.
I've always thought of feminism as a focused subgroup under the wider umbrella of humanism.
I think there's utility in having separate spaces for the time being to address gendered issues, especially when it comes to exploring and setting aside, or decoupling from, many of the gender tropes we all hold within our own gender groups.
In other words, men need to redefine manhood in a way that is not harmful to women and doesn't leave them in existential crises. My personal opinion on that is that they need to reattach themselves to the archetype of being providers and protectors, but get their heads out of their asses about it. Embrace the reality of EVERYTHING that means, keep it real, and ditch the action hero cosplay. But that's just my opinion; ultimately that should be up to men, and then it will either be accepted or rejected by women (individually). As always, there will be many who settle and ride it out as long as they can stand it.
For women, in my personal opinion, that means a lot more things. I think we need to start defending ourselves against male sexual aggression rather than looking to men for that. They've made it clear they don't care. That's going to mean embracing more physicality and learning to fight, as well as other strategies. Raise better sons. Reject all the dichotomies between women. We women define what womanhood means. No one else. Financial literacy is a must. Refuse ALL those damned dating apps. Obviously, restore reproductive rights, and I say hold the fucking economy hostage to do it. Full on "we burn, you burn with us." The greedy, smarmy bastards.
Basically, on the social level, I think feminism should focus on our becoming the women we were designed to be.