I would argue that the big reason Christianity has lost moral authority is that visible Christians were so consistently on the wrong side. This isn't true of all Christians, of course, but the Bible was used to defend slavery, racial discrimination, gender discrimination, and discrimination against gay and transgender people.
Until fairly recently, Christians tended to be associated with moral advances. That was surely the case in Roman times, and I think also in absorbing the "barbarians" after Rome fell. Even in more recent times, Christians took the lead in caring for the sick and others who couldn't care for themselves.
Until recently, polls showed that non-Christians might think Christians were deluding themselves about God, but that they were at least good people. Today, for the first time, non-Christians tend to see Christians as opposing moral progress.