There's a sense from your response that the article is attempting to imply something.
I didn't really get that.
But a lot of folks here made salient points, for sure.
The article isn't attempting to imply. It's blatantly stating that "good guys with guns" don't help in mass shooting situations. That's the point the article is trying to make. But there are too many confounders to know if that's actually true.
Even in the JAMA study you've linked, it's pretty clear that the authors have concluded that more guns = bad. The question is, did they conclude that from the data, or did they select the data to arrive at that conclusion?
Any time you read one of these studies, you should keep this in mind (emphasis added):
Giving the same information to multiple scientific teams can lead to very different conclusions, a report published today in Nature shows. And that's exactly why two researchers think scientists should share their data with others — well before they publish.
In this experiment, 29 scientific teams were given the same information about soccer games. They were asked to answer the question "Are dark-skinned players more likely to be given red cards than light-skinned ones?" Some scientists found that there was no significant difference between light-skinned and dark-skinned players, whereas others found a very strong trend toward giving more red cards to dark-skinned players. So, even though a pooled result showed that dark-skinned players were 30 percent more likely than light-skinned players to receive red cards, the final conclusion drawn from this exercise — that a bias exists — was a lot more nuanced than it likely would have been if only one team had conducted the analysis.
Source: "Crowdsourcing research 'gives space to dissenting opinions'", www.theverge.com
What's interesting here is that 29 research teams given the same data came to vastly different conclusions. When someone presents you with a "scientific study" as "proof" of anything, you should have questions. What were the selection criteria? What confounders were identified, and how were they controlled or adjusted for? What was the methodology of the study? What data was not considered or was excluded from the study?
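The point about analytic choices can be made concrete. The sketch below uses entirely invented counts (a standard Simpson's-paradox construction, with a hypothetical "league" as the confounder; none of these numbers come from the Nature soccer study or the JAMA study) to show how two reasonable analysts, given the identical table, can reach opposite conclusions depending on whether they pool the data or stratify by a confounder:

```python
# Synthetic illustration only: the same table can support opposite
# conclusions depending on whether the analyst pools or stratifies.
# All counts are invented; "league" is a hypothetical confounder.

# (red cards, total bookings) by skin tone and league
counts = {
    ("light", "league_1"): (81, 87),
    ("light", "league_2"): (192, 263),
    ("dark",  "league_1"): (234, 270),
    ("dark",  "league_2"): (55, 80),
}

def rate(tone, league=None):
    """Red-card rate for a skin tone, pooled unless a league is given."""
    items = [v for k, v in counts.items()
             if k[0] == tone and (league is None or k[1] == league)]
    reds = sum(r for r, _ in items)
    total = sum(t for _, t in items)
    return reds / total

# Analyst A pools everything: dark-skinned players look worse off.
assert rate("dark") > rate("light")

# Analyst B stratifies by league: the direction reverses in EVERY stratum.
for league in ("league_1", "league_2"):
    assert rate("dark", league) < rate("light", league)

print("pooled:  light", round(rate("light"), 3), " dark", round(rate("dark"), 3))
for lg in ("league_1", "league_2"):
    print(f"{lg}: light", round(rate("light", lg), 3),
          " dark", round(rate("dark", lg), 3))
```

Both analysts ran a correct computation on the same data; the disagreement is entirely a modeling choice, which is exactly the kind of thing the questions above are meant to surface.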
The first thing I noted about the JAMA study was in the methods, where it stated how they selected which shootings they would include in their analysis.
They state (emphasis added):
We examined each identified case where more than one person was intentionally shot in a school building during a school day or a person arrived at school with the intent of firing indiscriminately (133 total cases) from 1980 to 2019 as reported by the public K-12 School Shooting Database.
Here are two qualifiers that raise questions. First, they've excluded any shooting where only one person was intentionally shot. Why? One could hypothesize that shootings with a single victim may be more likely to end quickly when an armed officer is present, so excluding them seems... questionable. Second, they included only shooters who were "firing indiscriminately," so a shooting that targeted a specific person or group is likewise excluded from the analysis for some reason. Neither factor (a single victim, or whether the shooting was targeted) has anything to do with whether the presence of an armed resource officer was beneficial, so we should ask why those incidents were left out of the analysis.
I have no idea whether having armed officers in schools is helpful (I suspect it varies largely with the capabilities and competencies of the resource officers in question), but this study reads more like propaganda than like any kind of objective scientific analysis.