LexiGuard - Toxic Comment Classification
Data Science & AI | Purvi Parmar | May 17, 2024
What is the final project useful for?
Our goal is to combat online toxicity and foster healthy dialogue. We are building an API that scores the toxicity of comments, helping moderators keep digital channels safe. The final product features a toxic comment classifier and a dashboard that lets moderators manage conversations efficiently, helping companies, NGOs, and individuals ensure safer online interactions.
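To make the idea concrete, here is a minimal sketch of the kind of scoring function such an API could expose. This is a hypothetical illustration, not the project's actual model: it trains a simple TF-IDF plus logistic regression pipeline on a tiny toy dataset standing in for a labeled toxic-comment corpus, and returns a probability that a comment is toxic.

```python
# Hypothetical sketch of a toxicity scorer an API could wrap.
# Not the project's actual model; toy data for illustration only.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Toy labeled comments: 1 = toxic, 0 = non-toxic.
comments = [
    "You are an idiot and nobody likes you",
    "I completely disagree, but thanks for sharing",
    "Shut up, you worthless troll",
    "Great point, I had not considered that angle",
]
labels = [1, 0, 1, 0]

# TF-IDF features fed into a logistic regression classifier.
model = make_pipeline(TfidfVectorizer(), LogisticRegression())
model.fit(comments, labels)

def toxicity_score(comment: str) -> float:
    """Return the predicted probability that a comment is toxic."""
    return float(model.predict_proba([comment])[0, 1])

print(round(toxicity_score("You are a worthless idiot"), 2))
```

A production version would swap the toy pipeline for a model trained on a real corpus and serve `toxicity_score` behind an HTTP endpoint, so moderation dashboards can query it per comment.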
What does the final project look like?
