
From graduation to professional career: has medicine in Brazil become more feminist?

The second term, feminist medicine, refers to a situation of equality and equity between men and women doctors, which has not yet been achieved.

Read more @ europeansting.com
