I talk about why Hollywood is apparently going all-out feminist now and casting female actors for all the male lead characters - especially in the major science fiction franchises. Is it a feminist, a leftist, or even a communist conspiracy? Are they trying to brainwash our children into genderless creatures? Or is there maybe a simpler and much more plausible explanation?