One common claim from ID proponents is that evolution's days are numbered and that scientists are progressively turning toward Intelligent Design as an alternative. This is certainly the message that groups like the Discovery Institute like to convey, especially via things like their infamous "Scientific Dissent from Darwinism" list.
However, a more meaningful measure of the impact of Intelligent Design on science is available through a couple of means. The first is to examine the volume of ID-related research output. The second is to examine the impact of that research on the broader scientific community.
To test this, I examined the ID journal Bio-Complexity. This peer-reviewed journal was first touted by the Discovery Institute back in 2010 as a means whereby scientific papers on ID could be peer-reviewed and published. They claimed the journal would "accelerate the pace and heighten the tone of the debate over intelligent design". The journal began publishing articles in 2010 and has continued since.
I decided to examine what has been published in Bio-Complexity. Fortunately this was quite straightforward, since there isn't much content to sift through.
Between 2010 and 2018 (there are no 2019 publications yet), a total of 31 articles were published. These articles were divided into four categories: Research Articles (17), Critical Reviews (9), Critical Focus (4), and Tools/Techniques (1).
I used the 17 research articles published in Bio-Complexity to examine relative impact. As a measure of impact I used the number of citations each article had received. A citation indicates that the article has been cited in another work; in general, the more citations an article has, the greater its impact. I primarily used scientific publication search engines (e.g., ResearchGate and SpringerLink) to source the citation counts for the articles.
One research article was not available through such search engines, so I could not determine its citation count and excluded it from the study. For the remaining 16 articles, I counted a combined total of 50 citations. Six articles had zero citations; the rest had between 1 and 12 citations each.
I also excluded two self-referential citations in two of the articles. The articles published in 2016 included Genetic Modeling of Human History Part 1 and Genetic Modeling of Human History Part 2. Each of these articles cited the other (i.e., Part 1 cited Part 2, and Part 2 cited Part 1). I removed those specific citations to get a more accurate picture of external citations.
The overall average for the 16 articles is 3.13 citations per article. The median number of citations is 1.
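For anyone who wants to check the arithmetic, here is a minimal sketch. The per-article citation counts below are hypothetical, chosen only to be consistent with the summary figures reported above (16 articles, 50 citations total, six articles with zero citations, the rest between 1 and 12); the actual per-article counts are not listed in this post.

```python
from statistics import mean, median

# Hypothetical per-article citation counts, constructed to match the
# reported summary: 16 articles, 50 citations total, six zeros,
# nonzero counts between 1 and 12. Not the actual data.
citations = [0, 0, 0, 0, 0, 0, 1, 1, 1, 2, 4, 5, 6, 8, 10, 12]

assert len(citations) == 16 and sum(citations) == 50

print(mean(citations))    # 3.125, i.e. ~3.13 citations per article
print(median(citations))  # 1.0
```

Note that the median is far below the mean: a couple of relatively well-cited papers pull the average up, while the typical article has one citation or none.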
------------------------------------------------
Correction: I inadvertently listed one of the papers (Model and Laboratory Demonstrations That Evolutionary Optimization Works Well Only If Preceded by Invention--Selection Itself Is Not Inventive) as having 40 citations. I had accidentally counted the number of references in the article itself; the article in question has 0 citations. I have corrected the OP to reflect this, which reduces the average to 3.13 citations per article.