I actually don't think computers are capable of truth-preservation (validity). You can get them to mimic the symbol-moves of certain formal logical systems, but this isn't possible with natural language because natural language is not a formal language.
Well, we know computers excel at logical verification using formal logic, but the output of LLMs is by itself non-deterministic. (That said, Business and Enterprise users of ChatGPT, and I believe Pro users too, can make use of an embedded code interpreter and data-analysis subsystem that can do things like run Python code generated by the LLM, which allows the system to create software on the fly, test it, and execute it.)
Outside the domain of AI, Prolog is a programming language built directly on formal logic: one programs by declaring facts, predicates and rules, and the system derives conclusions from them. It isn't widely used, but it does exist. Most practical logic validation, however, is done in conventional procedural and functional languages. Indeed, Verilog, a language used for decades now for designing logic circuits such as CPUs, memories, GPUs and so on*, is intended to provide, among other things, verification of the design logic, although numerous other verification steps are required, as should be obvious if you read my footnote.
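To make the Prolog idea concrete without writing actual Prolog, here is a toy sketch in Python of the same style of programming: facts and a rule are declared as data, and the program forward-chains until nothing new can be derived. All the names (the "parent"/"grandparent" predicates) are illustrative, not taken from any real system.

```python
# Facts, Prolog-style: parent(tom, bob) and parent(bob, ann).
facts = {("parent", "tom", "bob"), ("parent", "bob", "ann")}

def grandparent_rule(facts):
    """If parent(X, Y) and parent(Y, Z), derive grandparent(X, Z)."""
    derived = set()
    for (p1, x, y1) in facts:
        for (p2, y2, z) in facts:
            if p1 == p2 == "parent" and y1 == y2:
                derived.add(("grandparent", x, z))
    return derived

# Forward-chain: apply the rule until a fixed point is reached.
while True:
    new = grandparent_rule(facts) - facts
    if not new:
        break
    facts |= new

print(("grandparent", "tom", "ann") in facts)  # True
```

Real Prolog does much more (unification, backtracking, queries with variables), but the declarative flavour, stating what is true and letting the machine infer the rest, is the same.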
However, formal verification of computer software does have its limits. The classic example is the Halting Problem, discovered by Alan Turing: it cannot be decided in general whether a program will halt or continue running indefinitely on a given input, even if you try to create some kind of meta-computer that evaluates every action of the computer being simulated. Mathematically, the problem is simply unsolvable. This does not mean, however, that a computer program cannot do basic logic verification. Indeed, much of the software we use on a daily basis, and by much I mean virtually all of it, is doing exactly that in various ways: number theory underpins public-key encryption; bitwise Boolean operations generate complementary bit patterns for applications from graphics to program logic and compilation; hashing algorithms from families like SHA-2 create fixed-size encoded representations of data such as passwords or message texts; and GPUs do floating-point math for vector graphics.
*Verilog can be used to generate a digital simulation of the device, to program an FPGA (field-programmable gate array**, a dynamically reprogrammable hardware device), or, with complex toolchains from companies like Siemens EDA (formerly Mentor Graphics), to set up workflows to prototype, verify and begin mass production of CPUs, GPUs, ASICs and so on at semiconductor fabrication facilities, or fabs. The industry-leading fabs are operated by TSMC in Taiwan and, to a lesser extent, Samsung and Intel, using roughly $300,000,000 extreme-ultraviolet (EUV) lithography tools, arguably the most complex machines in existence. They are assembled by a Dutch company called ASML, with elaborate optics from a German company, Carl Zeiss SMT, and many subsystems from other companies, some of which are wholly owned, like Cymer***, and others of which are independent and exist as part of the overall industrial ecosystem required for technology at this scale to flourish.
** FPGAs are not just used to test new electronics; they are also used on some hardware platforms to provide dynamically reprogrammable logic circuits. For example, high-end networking hardware such as switches and firewalls can include FPGAs to allow high-speed processing for various applications, such as blocking denial-of-service attacks or ultra-low-latency networking for real-time trading. Indeed, there are FPGAs affordable to mere mortals which can be used to run open-source hardware designs for various purposes.
*** Cymer makes the high-powered laser system used to generate flashes of extreme ultraviolet light. The way it works is that a molten tin droplet is first deformed into a concave shape (from the laser's perspective) and then vaporized with two precisely timed pulses, producing a flash of extreme ultraviolet light whose wavelength is short enough that, with the proper optics, it can print the extremely small features on modern semiconductors, which are just a few nanometers in size. Indeed, we're approaching a point where the features will be so small that quantum tunneling is likely to be an issue (a bug, not a feature, because it does not enable quantum computing). Quantum computing, if we are ever able to get it to work at scale, requires supercooled hardware isolated from the outside environment, and there are open questions about whether quantum computers will actually work, how scalable they will be, and whether they will genuinely outperform conventional computers. If the many engineering problems are definitively solved, quantum computers will paradoxically give us effectively unbreakable quantum key distribution while also breaking today's widely deployed public-key encryption (RSA and elliptic-curve schemes, via Shor's algorithm); symmetric ciphers are weakened far less severely. I personally hope quantum computing proves to be a dead end, because my fear is that these systems could divide us into quantum computing haves and have-nots, allowing rogue states, totalitarian regimes and non-state actors with access to the technology to violate the privacy of ordinary citizens or steal their data for various nefarious purposes whenever that data passes over an internet connection, which is increasingly the case.