We are discussing the recent abortion law in Alabama and its wider implications.
Are women being wrongfully left out of a debate about their own bodies?
Are we going backwards as a society?
Is abortion the way forward?
What does this mean for the future?