I want to further elaborate on what you said @atrum-uumbra, because I believe it's an important topic we haven't fully addressed. These companies' reputations depend on, and are a reflection of, their platforms. If Activision is known as a breeding ground for toxicity, hate speech, etc. (which it currently is), that reflects negatively on them and on COD as a whole, degrading their image and potentially convincing parents not to let their kids play COD. In the same sense, if Techlore were overrun with people promoting (insert bad thing) and there were no moderation actions taken against those users, the Techlore forum itself would come to be seen as (same bad thing) for giving those people a platform. Essentially, for platforms, if you aren't discouraging the behavior, you're enabling it, even if you aren't actively encouraging it.
Unfortunately, manual real-time moderation for platforms this large would be nearly impossible and prohibitively expensive. Report systems and automation are essential for handling moderation at that scale.
As previously expressed, in my opinion, for voice chat, if you are reported, a small segment of your recent speech (say, the last 60 seconds) should be saved and held for moderator review. Upon review, the moderator decides whether action is necessary, and the audio clip is then automatically deleted. This balances privacy and moderation: it allows action against problematic users, while nothing is ever stored unless someone is reported, and the clip is deleted as soon as a moderator has reviewed it.
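To make the idea concrete, here's a minimal sketch of how that report-triggered retention could work. This is just an illustration under my own assumptions, not any platform's actual implementation; all the names (`RollingVoiceBuffer`, `ModerationQueue`, etc.) are hypothetical:

```python
from collections import deque
from dataclasses import dataclass

CLIP_WINDOW_SECONDS = 60  # only the most recent speech is ever retained


@dataclass
class AudioChunk:
    timestamp: float  # wall-clock time the chunk was captured
    data: bytes       # raw or encoded audio payload


class RollingVoiceBuffer:
    """Keeps only the last CLIP_WINDOW_SECONDS of a user's voice chat in memory."""

    def __init__(self) -> None:
        self._chunks: deque[AudioChunk] = deque()

    def push(self, chunk: AudioChunk) -> None:
        self._chunks.append(chunk)
        # Evict anything older than the retention window.
        cutoff = chunk.timestamp - CLIP_WINDOW_SECONDS
        while self._chunks and self._chunks[0].timestamp < cutoff:
            self._chunks.popleft()

    def snapshot(self) -> bytes:
        """Freeze the current window into a clip for moderator review."""
        return b"".join(c.data for c in self._chunks)


class ModerationQueue:
    """Holds reported clips until a moderator reviews them, then deletes them."""

    def __init__(self) -> None:
        self._pending: dict[str, bytes] = {}  # report_id -> saved audio clip

    def file_report(self, report_id: str, buffer: RollingVoiceBuffer) -> None:
        # Audio is persisted only at report time; unreported speech
        # never leaves the rolling in-memory buffer.
        self._pending[report_id] = buffer.snapshot()

    def review(self, report_id: str, take_action) -> None:
        clip = self._pending.pop(report_id)  # deletion is unconditional...
        take_action(clip)                    # ...whether or not action is taken
```

The key property is that deletion isn't optional or left to the moderator's discretion: `review()` removes the clip no matter what, so the only audio that ever persists is a reported window awaiting a single review.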