Is the company holding him personally responsible or threatening to limit his use of their service in any way?
They have been demonetizing videos on his channel, based on AI flagging and other factors, for some time, as they have with many channels, including progressive channels that are anti-war.
For a time in August 2017 they locked him out of his channel, citing unspecified terms-of-service violations, and indicated the channel would not be reinstated. But later they did reinstate it. Many creators have had issues with YouTube taking action against them for vague reasons that often come down to the viewpoints they express.
They have recently started demonetizing more in the wake of the Crowder issue.
He posted on Twitter about their use of the term Nazi for him, and we will have to wait to see what, if anything, he says about the recommendation trend data.
They're discussing how their software works, and in the process pointing out that the rhetoric these people use bears similarities to the language used by white nationalists, which causes the AI to create these links. Why would it be a problem if they put in place custom measures to prevent this from happening?
A. It would not be a problem if they prevented it from happening.
B. They called the three Nazis and said they are using dog whistles. Dog whistles are supposed to be intentional but less overt signals to an audience. But the AI even impacted history channels that simply discussed Nazis, as well as anti-Nazi groups. Yes, it would be good to fix that. But to use such terms to describe posters who are not Nazis is not objective.
And if the solution is to not promote certain content creators because of ideology, then they may run afoul of Section 230 of the Communications Decency Act, under which they receive liability protections as a platform rather than an editorial publisher.
There is currently a bill proposed in the Senate to establish an enforcement mechanism to ensure that large tech companies are complying with this.