Claims of PhD Level Computer Code Generation Tools

Stephen3141

Well-Known Member
Mar 14, 2023
1,425
552
69
Southwest
✟100,195.00
Country
United States
Faith
Catholic
Marital Status
Private

Apparently, the development of "agents" (AI software products that can generate
computer source code to carry out certain operations) is now a nifty topic
for the large software companies.

What this actually means is up in the air.

I say that because of some very basic truths that Computer Science has grasped...

1. No one in Computer Science courses learns algorithm design by coding in
actual computer languages -- they use an abstract logical notation called
pseudocode. Generating a specific computer language's source code version
of an algorithm is a PRETTY SIMPLE TASK: basically, all you need to do is
write a translator. This is not advanced CS, or AI theory.

2. Much of the computer source code that has historically been machine-generated is POOR
quality. (Take the MATLAB tool that supposedly could translate MATLAB
language into C source code: the quality of the generated source code was often
awful.)

The quality of the source code produced depends on the quality of the
algorithmic pseudocode. Crappy pseudocode produces crappy
source code. PhD-level pseudocode is still needed.

If users are going to provide the "pseudocode" through queries, then you
should expect most of their misunderstandings of how to reach a
goal to be reflected in generated source code with the same design
defects. This is not solving a problem professionally -- this is producing
working computer source code that carries all the defects of the thinking
of the common user.

3. If these AI source code generation tools are used, then companies CAN replace
a bunch of basically unskilled coders. BUT producing software products that
are all alike also makes it possible to hack them ALL -- a real danger.

4. Source code generating tools ARE needed, as the average operating system in
a computer has about 5,000 MAN-YEARS of programming built into it. You cannot
simply rewrite an entire operating system in order to update it. These tools would
allow legions of unpaid coders to do just that, IF the algorithms described are
smart and safe. IF...

By focusing on the translation of an algorithm into source code, the big software
companies are hiding the REAL challenge: designing safe and moral
software packages. I don't see high-level tech managers addressing THAT topic.
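To make points 1 and 2 above concrete, here is a toy sketch in Python (my own illustration; the pseudocode and function names are invented for this post, not taken from any real tool). Translating textbook pseudocode into a real language is nearly mechanical, while a translator fed a poorly designed spec faithfully reproduces its defects.

```python
# Point 1: textbook pseudocode for binary search translates almost
# line-for-line into a real language.
#
#   low <- 0; high <- length(A) - 1
#   while low <= high:
#       mid <- floor((low + high) / 2)
#       if A[mid] = target then return mid
#       else if A[mid] < target then low <- mid + 1
#       else high <- mid - 1
#   return "not found"

def binary_search(items, target):
    low, high = 0, len(items) - 1
    while low <= high:
        mid = (low + high) // 2
        if items[mid] == target:
            return mid
        elif items[mid] < target:
            low = mid + 1
        else:
            high = mid - 1
    return -1  # "not found"

# Point 2: a translator cannot repair a bad design. If the user's
# "pseudocode" says "compare every pair of items", the generated code is
# a faithful -- and faithfully slow -- O(n^2) duplicate check.
def has_duplicates_as_specified(items):
    for i in range(len(items)):
        for j in range(i + 1, len(items)):
            if items[i] == items[j]:
                return True
    return False

# The O(n) design a trained algorithm designer would specify instead.
def has_duplicates_redesigned(items):
    return len(set(items)) != len(items)
```

Both duplicate checks agree on every input; the difference is the algorithm design, which no pseudocode-to-source translator will fix on its own.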
 
  • Like
Reactions: Vambram

JustaPewFiller

Active Member
Apr 1, 2024
218
178
60
Florida
✟54,943.00
Country
United States
Faith
Baptist
Marital Status
Married
Businesses have been wanting to do this for years and years and years.

Over the years I've been part of several projects where management hoped the latest gimmick would either generate code from pseudocode, or take code already written in an old language and rewrite it in a more modern language (for example, PL/1 to C).

In every case the goal was the same: more speed and less cost, because they could fire (or not hire) a bunch of developers.
In every case the end result was the same: it wasn't any faster, and the cost was the same or more, because they had to hire developers to fix the mess of code that was generated by the machine.

Admittedly, some of the pseudocode-to-code tools did work fairly well if you stayed within their limits.

AI is the latest "tool" in that cycle, and I'm sure it will have some success. It's an interesting thing...

Artists and writers had some success in getting people concerned about AI taking their jobs.

Is anybody going to care when AI takes a bunch of tech jobs?
 
  • Like
Reactions: Vambram
Upvote 0

FireDragon76

Well-Known Member
Site Supporter
Apr 30, 2013
33,426
20,719
Orlando, Florida
✟1,507,165.00
Country
United States
Gender
Male
Faith
United Ch. of Christ
Marital Status
Private
Politics
US-Democrat

Apparently, the development of "agents" (AI software products that can generate
computer source code to carry out certain operations) is now a nifty topic
for the large software companies.

What this actually means is up in the air.

I say that because of some very basic truths that Computer Science has grasped...

1. No one in Computer Science courses learns algorithm design by coding in
actual computer languages -- they use an abstract logical notation called
pseudocode. Generating a specific computer language's source code version
of an algorithm is a PRETTY SIMPLE TASK: basically, all you need to do is
write a translator. This is not advanced CS, or AI theory.

2. Much of the computer source code that has historically been machine-generated is POOR
quality. (Take the MATLAB tool that supposedly could translate MATLAB
language into C source code: the quality of the generated source code was often
awful.)

The quality of the source code produced depends on the quality of the
algorithmic pseudocode. Crappy pseudocode produces crappy
source code. PhD-level pseudocode is still needed.

If users are going to provide the "pseudocode" through queries, then you
should expect most of their misunderstandings of how to reach a
goal to be reflected in generated source code with the same design
defects. This is not solving a problem professionally -- this is producing
working computer source code that carries all the defects of the thinking
of the common user.

3. If these AI source code generation tools are used, then companies CAN replace
a bunch of basically unskilled coders. BUT producing software products that
are all alike also makes it possible to hack them ALL -- a real danger.

4. Source code generating tools ARE needed, as the average operating system in
a computer has about 5,000 MAN-YEARS of programming built into it. You cannot
simply rewrite an entire operating system in order to update it. These tools would
allow legions of unpaid coders to do just that, IF the algorithms described are
smart and safe. IF...

By focusing on the translation of an algorithm into source code, the big software
companies are hiding the REAL challenge: designing safe and moral
software packages. I don't see high-level tech managers addressing THAT topic.

A year and a half ago or so, I used ChatGPT to help me write code in Visual Basic for Future Pinball, a PC pinball simulation. It was useful in helping me learn to code, but it made a lot of mistakes. Still, it was better than nothing.
 
  • Like
Reactions: Vambram
Upvote 0

Stephen3141

Well-Known Member
Mar 14, 2023
1,425
552
69
Southwest
✟100,195.00
Country
United States
Faith
Catholic
Marital Status
Private
A year and a half ago or so, I used ChatGPT to help me write code in Visual Basic for Future Pinball, a PC pinball simulation. It was useful in helping me learn to code, but it made a lot of mistakes. Still, it was better than nothing.

I have NO problem with someone using AI tools for learning. Learning
the basics of a computer language is a basic task (it's sort of
like using a learning tool to pick up a foreign language).

But the ongoing point that I try to repeat about AI is that it is an emulation
of COMPLEX HUMAN PROBLEM SOLVING. Learning the syntax of a computer
language is NOT a complex problem, according to Computer Science.

Learning algorithms to solve problems (using computer code) DOES get into the
region of complex human problem solving. Most hardware engineers who know
the syntax of a computer language know almost nothing about Computer
Science algorithm design, despite all the protests to the contrary.
---------- ----------

The new AI tools may replace some human workers, because they could
potentially automate the simpler types of human problem solving.
 
  • Like
Reactions: Vambram
Upvote 0


FireDragon76

Well-Known Member
Site Supporter
Apr 30, 2013
33,426
20,719
Orlando, Florida
✟1,507,165.00
Country
United States
Gender
Male
Faith
United Ch. of Christ
Marital Status
Private
Politics
US-Democrat
I have NO problem with someone using AI tools for learning. Learning
the basics of a computer language is a basic task (it's sort of
like using a learning tool to pick up a foreign language).

But the ongoing point that I try to repeat about AI is that it is an emulation
of COMPLEX HUMAN PROBLEM SOLVING. Learning the syntax of a computer
language is NOT a complex problem, according to Computer Science.

Learning algorithms to solve problems (using computer code) DOES get into the
region of complex human problem solving. Most hardware engineers who know
the syntax of a computer language know almost nothing about Computer
Science algorithm design, despite all the protests to the contrary.
---------- ----------

The new AI tools may replace some human workers, because they could
potentially automate the simpler types of human problem solving.

I've used ChatGPT o1 since then for more complex problems, like creating EQ calibration curves for audio work. o1 is definitely a step up from previous versions of ChatGPT in terms of its ability to reason and solve complex problems.

Some of the music that AI can generate now is pretty good -- not the most astounding in terms of artistry, and derivative, but a lot of pop music is frankly not very creative anyway. Remember the Star Trek episode where Data, the android crew member, writes a poem about his cat? A lot of AI lyric writing reminds me of that. OK, not quite that bad... but it does convey the general impression of something that is very good at grabbing a huge thesaurus and cobbling together rhymes, while lacking a bit of true ingenuity. But it's good enough to replace most of the musical artists out there, I'm afraid.
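For readers curious what the EQ calibration task amounts to, here is a minimal sketch in Python (my own illustration, with made-up frequency bands and measurements -- not the actual o1 output). A correction curve is just the per-band gain that moves a measured response toward a target level, usually clamped to a sane boost/cut range.

```python
def correction_curve(measured_db, target_db=0.0, limit_db=6.0):
    """Per-band gain (dB) that moves the measured response toward the
    target level, clamped to +/- limit_db to avoid extreme boosts/cuts."""
    return {
        freq: max(-limit_db, min(limit_db, target_db - level))
        for freq, level in measured_db.items()
    }

# Hypothetical room measurement: frequency (Hz) -> measured level (dB).
measured = {100: 4.0, 1000: -1.5, 10000: -9.0}
print(correction_curve(measured))  # {100: -4.0, 1000: 1.5, 10000: 6.0}
```

Note that the 10 kHz band needs a 9 dB boost but gets clamped to 6 dB -- the kind of judgment call (how much boost is safe) that still has to come from the person, not the tool.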
 
  • Like
Reactions: Vambram
Upvote 0

Stephen3141

Well-Known Member
Mar 14, 2023
1,425
552
69
Southwest
✟100,195.00
Country
United States
Faith
Catholic
Marital Status
Private
I've used ChatGPT o1 since then for more complex problems, like creating EQ calibration curves for audio work. o1 is definitely a step up from previous versions of ChatGPT in terms of its ability to reason and solve complex problems.

Some of the music that AI can generate now is pretty good -- not the most astounding in terms of artistry, and derivative, but a lot of pop music is frankly not very creative anyway. Remember the Star Trek episode where Data, the android crew member, writes a poem about his cat? A lot of AI lyric writing reminds me of that. OK, not quite that bad... but it does convey the general impression of something that is very good at grabbing a huge thesaurus and cobbling together rhymes, while lacking a bit of true ingenuity. But it's good enough to replace most of the musical artists out there, I'm afraid.

I agree that the software is getting better.

But again, the basic CS definition of artificial intelligence is not that
software is getting better and able to automate many mundane jobs that
human beings now do.

AI is the emulation (by a machine) of complex human problem solving.
Not just any definition of "complex" will do.

The advance of programmed software tools has always been accompanied
(I use "always" loosely, but the basic point is sound) by people who hail
the new advances as "AI".

Some of the "AI" advances are things that humans could never do (such
as computing millions of math problems a second). Some of the advances were
programmed into a computer by an intelligent human programmer (and so are
not really advances in machine intelligence).
---------- ----------

I think that the current struggle among AI developers is to emulate
the "abstract" problem solving that human beings are so good at. And
THIS is the aspect of AI tools that is most interesting to me.

I am posting some AI tool "challenge questions" on the thread on Moral-
Ethical models, Christian Morality, and AI, in order to tease out
what the new AI tools CAN do and what they are currently NOT capable
of handling. I have suggested that readers try out the questions on their
favorite AI generative tool, and post the results.
 
Upvote 0

FireDragon76

Well-Known Member
Site Supporter
Apr 30, 2013
33,426
20,719
Orlando, Florida
✟1,507,165.00
Country
United States
Gender
Male
Faith
United Ch. of Christ
Marital Status
Private
Politics
US-Democrat
I agree that the software is getting better.

But again, the basic CS definition of artificial intelligence is not that
software is getting better and able to automate many mundane jobs that
human beings now do.

AI is the emulation (by a machine) of complex human problem solving.
Not just any definition of "complex" will do.

The advance of programmed software tools has always been accompanied
(I use "always" loosely, but the basic point is sound) by people who hail
the new advances as "AI".

Some of the "AI" advances are things that humans could never do (such
as computing millions of math problems a second). Some of the advances were
programmed into a computer by an intelligent human programmer (and so are
not really advances in machine intelligence).
---------- ----------

We "program" our children in schools and teach them how to solve problems. Does that make them any less intelligent because they didn't figure it out on their own?
 
Upvote 0

Stephen3141

Well-Known Member
Mar 14, 2023
1,425
552
69
Southwest
✟100,195.00
Country
United States
Faith
Catholic
Marital Status
Private
We "program" our children in schools and teach them how to solve problems. Does that make them any less intelligent because they didn't figure it out on their own?

You're opening up all sorts of new topics there.

Teaching children is very different from programming a computer.
You would need to define what you mean by "programming" children.
 
Upvote 0