Islam gave women rights


With regard to the post "'In no Arab land do women have dignity'", it is Islam that gave women rights. Women in the West are treated like a commercial item: there is hardly an advertisement, from tires to floor tiles, without a scantily dressed woman. Muslim women have freedom within the limits established by the Almighty. Slave brides and honor killings are not Islamic; they are tribal customs that Islam does not accept and strongly condemns.

Ismaeel Marikar, Online response

"Heaven lies under the feet of thy mother" is what I have always been told, and I have believed and practiced it. The mother of every man is a woman. How can there be anyone like a mother? How can any man dare to show disrespect to his mother? How, then, can a man show disrespect to any woman?

Genius, Online response

