No, patriarchy is not essential to Christianity. While some interpretations of Christianity have traditionally emphasized male leadership and dominance over women, nothing inherent in Christian doctrine requires or endorses patriarchy.
In fact, women play important roles throughout the Bible and in Christian history. Mary, the mother of Jesus, is revered as a central figure in the faith, and the Old Testament records female prophets and leaders such as Deborah, who judged Israel, and the prophet Huldah. Additionally, many Christian denominations today ordain women as pastors and allow them to hold leadership positions within the church.
It is also worth noting that where patriarchy has prevailed in Christian societies, this has often stemmed from cultural and social factors rather than theological ones. In recent years, many Christian scholars and leaders have worked to challenge patriarchal structures and promote gender equality within the church.