For over a year, we have argued that people ought not to exult too much about Gen AI. Here is a summary of the points that have been made.
- Providers - those who let loose immature software without due regard to consequences. This war started in November of 2022, right before 2023.
- Users - almost immediately, the launch of a package with "free" access set the world aflame. Millions (tens of millions, eventually hundreds of millions) signed up. You know, I didn't become aware until 1 Feb 2023. That's two months later. What was the draw?
- Insight issues - as early as a week after the release, one writer at The Atlantic noted that this stuff was for fun and games; it could not do serious work. That theme played out through the whole of 2023 and even until now. Alongside these two camps, we saw scientific minds trying to understand its ways and uses; engineers, too, pondered use in a mature sense.
- Myself - I got involved since people were talking about how badly these things did simple math. I took it further and compared ChatGPT with Mathematica. Yep. It's not even close. But Wolfram saw the potential of the thing as an interface; he tied ChatGPT to the language of his system and showed that the LLM (the other side of this coin) was a terrific frontend to major systems. That has now been done in many situations. Wolfram took it further and increased his work on adding knowledge of physics to his system. My counsel was that these buckets of bits, for all their novelty, were really examples of applied mathematics in action but lend themselves to abuse since the underlying mathematics is complicated. Too, it's easily misused (please keep reading). A small sketch of the interface pattern follows this list.
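To make the interface idea concrete, here is a minimal sketch of the pattern: the language model only translates a question into a symbolic expression, and an exact engine does the computation. Note the assumptions: sympy stands in for Mathematica / Wolfram Language, and the translate() step is a hypothetical stub where a real system would call an LLM.

```python
# Minimal sketch of the "LLM as frontend" pattern described above.
# The language model only translates a question into a symbolic expression;
# an exact engine does the computation. Here sympy stands in for
# Mathematica / Wolfram Language, and translate() is a hypothetical stub.

import sympy as sp

def translate(question: str) -> str:
    """Hypothetical stand-in for an LLM that maps natural language to an expression string."""
    canned = {
        "what is 123456789 times 987654321?": "123456789 * 987654321",
        "expand (x + 1) cubed": "expand((x + 1)**3)",
    }
    return canned[question.lower()]

def answer(question: str):
    # sympify evaluates the expression exactly -- no token-by-token guessing.
    return sp.sympify(translate(question))

if __name__ == "__main__":
    print(answer("What is 123456789 times 987654321?"))  # exact integer product
    print(answer("Expand (x + 1) cubed"))                # x**3 + 3*x**2 + 3*x + 1
```

The point is the division of labor: the exact answers come from the symbolic engine, not from the language model.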
Let's stop there, as we want to get to the gist of this post; we can expand upon this list as the discussions continue.
A few days ago, researchers at several institutions submitted a paper to arXiv, which is supported by Cornell University and allows quick publication of research results. The graphic shown here was built from the contents of, and is included as a figure in, the article: Beyond Euclid: An Illustrated Guide to Modern Machine Learning with Geometric, Topological, and Algebraic Structures.
As I have been reviewing the situation and exercising a few systems of the GenAI variety, I recall my experience with KBE (knowledge-based engineering), about which I will be writing further. I have also written articles on the subject, the first being AI, not solely machine learning (11 Dec 2023). That seems late, but I wanted to be thorough. I have just started the fifth of the series, which has no expected ending.
In several forums, I have argued that we need to lift the mathematics out for public view and scrutiny. Okay, some might see this math as magical. Some will not. In fact, we can explain as needed, just as Einstein said that he could explain his relativity theory to his grandmother (or ought to be able to). This article is an attempt at that, and a very good one.
Let's see to it that this type of thing becomes an area of research and study.
Wait, on the other hand, I have also argued that we are dealing with buckets of bits that are good at pulling our leg. We can explain the phenomena behind the effects that we see. There is no creature emerging that is about to take over. GenAI is no closer to causing a "Singularity" than anything else that we have seen.
Heck, we humans can do that ourselves (evidence abounds).
Next up, I am going to write more on KBE and its history, as well as the evolution of CAD/CAE and other computationally assisted systems over the past few decades.
----
Hint: I may as well get it off my chest. My focus is truth engineering. There are reasons for that. One is that people are involved in the judgments that involve truth. We cannot compute truth. Nor can we know it outright, in general. Truth is a private experience. Now, then, computing and truth? Let me just mention a few concepts that we will look at further:
- Homogeneity - a strong assumption that is often taken without a basis and, as far as I can see, often ignored; lots to discuss.
- Equivariance - yes, fiddling, fudging, force fit. I will use the Boeing 777 and its success in attaining "fit" as well as meeting form and function; it was the first attempt at complete (though not fully met) digital design, and the metrics accomplishment was real. In this paper, the concept can be found in the references and in one area where Lie algebra plays a role. A compact definition follows below.
- Geometry, topology, algebra - there are more subjects to bring to proper attention, such as category theory, dynamics (of various sorts), and more.
Essentially, as Poincaré noted, mathematics is a huge subject. What has been associated so far with this approach to computing is a small subset, yet it drew attention due to its unexpected ability to bring results that caught our eye. So, computing got more powerful? We are so far from truth that we have to step back and get more scholarly.
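For later reference, equivariance has a compact, standard definition (the textbook form, not something lifted from the paper): a map between two spaces on which a group acts is equivariant when acting first and then mapping gives the same result as mapping first and then acting.

```latex
% Equivariance of a map f : X \to Y under a group G acting on both X and Y.
% Invariance is the special case where G acts trivially on Y.
f(g \cdot x) = g \cdot f(x) \qquad \text{for all } g \in G,\ x \in X
```

This is the property that geometric machine learning architectures try to build in by construction, rather than hoping a model learns it from data.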
Remarks: Modified: 07/19/2024
07/16/2024 --