I am not sure if you read all the posts in this thread, but that is not correct. I will re-post some of the info.
The court filing in the Parler vs. Amazon Web Services case makes the following claims in relation to this:
https://www.courtlistener.com/recap/gov.uscourts.wawd.294664/gov.uscourts.wawd.294664.21.0_2.pdf
And AWS has confirmed that none of the arrested participants in that unconscionable attack (who had been publicly identified as of the filing of this action) even had a Parler account, much less used it to “incite, organize or coordinate” the attack.
After Twitter banned President Trump on January 8, the increased new users and activity caused Parler to go down for seven hours, resulting in a backlog of 26,000 instances of content that potentially encouraged violence. Peikoff Dec., ¶ 13; Matze Dec., ¶ 9. However, over the next two days, Parler was able to systematically remove almost all of this content, which progress was reported to AWS, and within 48 hours, by the end of Sunday—when AWS shut Parler down—Parler had removed all but some 1,000 problematic posts. Peikoff Dec., ¶ 14. Parler also identified additional steps it was taking to quickly identify and remove problematic content. Peikoff Dec., ¶ 15. Indeed, AWS had known since mid-December that Parler would be adopting in 2021 an Artificial Intelligence (“AI”) system that would pre-screen inappropriate content, which had shown promising initial results, and there was discussion of Parler adopting AWS’s own AI system.
The woman killed by law enforcement when she forced her way into the Capitol Building, Ashli Babbitt, did have a Parler account, but it had not been used since November. Id. She also had a Twitter account that was in use the day of the riot, January 6, 2021. Ex. D. Even the government claims that the attack was coordinated in part by Twitter.1 And claims reportedly showing Parler users being involved in the Capitol Riot because the metadata on videos uploaded to Parler reveal the location where they were recorded misunderstand the evidence: Videos recorded by others, shared, and then uploaded by a separate Parler user will still show the original video location, even if the Parler user wasn’t there and didn’t record the video. Matze Dec., ¶ 4.
They were removing illegal content. They had an influx of such posts when Trump was banned on Twitter. The surge of new members took the service out for a while. Then they had to clean up the posts. They cleaned up 25,000 of the 26,000 such posts in two days, which is rather fast. They were going to finish removing the rest but were shut down first.
He also indicates they were employing some AI, and were in discussions with Amazon about whether to use Amazon's own AI system.
So they were not refusing to remove content.
Threats of violence were also prohibited under their terms of service. From back in June:
Parler’s new policy: No poop, no inappropriate content, no murder
Another rule communicated via Mr. Matze is, “You cannot threaten to kill anyone in the comment section. Sorry, never ever going to be okay.”
“If ever in doubt, ask yourself if you would say it on the streets of New York or national television,” Mr. Matze added.