Thursday, November 30, 2023

Technical review

TL;DR -- Since we are making technology a regular focus, we will have to be regular about posting on it. Expect more of that in 2024 as we commemorate the arrival of Thomas and Margaret (or not, depending upon our research).

---

We ought to do these reviews on a regular basis and will, starting next year. For this post, we'll look briefly at a New Yorker article. Then, we'll add a comment about a recent post. After that, we'll show a graphic from one of our web servers. Finally, we'll start a list of our web presence and gather the technical posts together under various categories. 

Ambitious task? Yes. We'll start today and finish this up within a week. At the same time, we will consider how to do these reviews and with what frequency. 

---

AIn't and AI. We'll be more specific next year. 

  • The Godfather of AI - See the New Yorker, 11/17/2023. This is the main article; it dealt with an interview of Hinton. There was a genealogical notion brought up: he is a Brit and a descendant of George Boole, of the logic and algebra that we all love. After casting about for some direction, he picked neural nets. One early act was popularizing the Boltzmann Machine. As we all know, none of the machines/methods found so far is all-powerful. Subsequent work ended up with the back-prop algorithm, which was written up in Nature in 1986 (a minimal sketch of the idea follows this list). In a sense, this is similar to numerical processing aimed at resolving a many-body problem. Definitely, constraint satisfaction applications need a good look. One interesting tidbit is that the author of the article used Kafka's worldview as the basis for an example. To note, please: on seeing an interaction with ChatGPT, Hinton was astonished enough to talk of a "level of understanding" and even uttered "alive" in that context. He has kept sight of lower-level reality, though, in that his later work deals with neuromorphic approaches. 
  • The Economist as example - See the last post: Science and AI. This was motivated by seeing an article in the 11/25/2023 issue of the paper (not a magazine, they say) in which a reporter hypes some good work dealing with rogue waves. Now, everyone ought to be interested, as waves are everywhere and have long drawn the attention of thinkers. Yet, in terms of the seas, this is old research with lots of data, and people have done an exemplary job in trying to understand that data. So, the researcher used a neural net to look at some pre-processed data in which the mathematical elements were emphasized. Okay. Good results. But a genetic (to be discussed) approach was about as capable; a generic sketch of that kind of comparison follows this list. The researcher had a disclosure at the top of his report. Did The Economist's writer not see it? Too, there is discussion about next steps. Our gripe? The use of "AI" here, as AI encompasses much more than machine learning. Now, the post? It links to the data, the paper, and the code itself, which is on GitHub. That is how things will be, more or less, as research goes further. 
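For the curious, here is a minimal sketch of what back-propagation amounts to: run the network forward, then push the error back through each layer with the chain rule and nudge the weights downhill. The toy data, layer sizes, and learning rate are invented for illustration; this is not Hinton's code, just the textbook recipe in Python with NumPy.

    # Toy two-layer network trained by back-propagation (gradient descent).
    import numpy as np

    rng = np.random.default_rng(0)
    X = rng.normal(size=(100, 3))                   # made-up inputs
    y = (X.sum(axis=1, keepdims=True) > 0) * 1.0    # made-up targets

    W1 = rng.normal(scale=0.5, size=(3, 5))         # input -> hidden weights
    W2 = rng.normal(scale=0.5, size=(5, 1))         # hidden -> output weights
    lr = 0.1

    for step in range(500):
        h = np.tanh(X @ W1)                         # forward pass, hidden layer
        p = 1.0 / (1.0 + np.exp(-(h @ W2)))         # forward pass, sigmoid output
        err = p - y                                 # output error
        gW2 = h.T @ err / len(X)                    # gradient at the output weights
        gh = (err @ W2.T) * (1.0 - h**2)            # chain rule back through tanh
        gW1 = X.T @ gh / len(X)                     # gradient at the input weights
        W1 -= lr * gW1                              # adjust the weights
        W2 -= lr * gW2

The same loop, scaled up by orders of magnitude and run on specialized hardware, is the machinery behind the systems discussed above.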
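And here is a generic sketch of the kind of comparison the rogue-wave discussion turns on: fit a small neural net and a plain baseline on the same pre-processed features and see whether the fancier model earns its keep. The feature names and the target formula below are invented for illustration; the real study's data and code are linked from our earlier post.

    # Small neural net vs. a linear baseline on synthetic "engineered" features.
    import numpy as np
    from sklearn.linear_model import LinearRegression
    from sklearn.model_selection import train_test_split
    from sklearn.neural_network import MLPRegressor

    rng = np.random.default_rng(1)
    n = 2000
    steepness = rng.uniform(0.01, 0.10, n)          # hypothetical engineered features
    bandwidth = rng.uniform(0.10, 0.50, n)
    depth_ratio = rng.uniform(0.5, 5.0, n)
    X = np.column_stack([steepness, bandwidth, depth_ratio])
    y = 2.0 * steepness + 0.5 * bandwidth - 0.1 * depth_ratio + rng.normal(0.0, 0.01, n)

    X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
    net = MLPRegressor(hidden_layer_sizes=(16, 16), max_iter=2000, random_state=0)
    lin = LinearRegression()
    print("net R^2:   ", round(net.fit(X_tr, y_tr).score(X_te, y_te), 3))
    print("linear R^2:", round(lin.fit(X_tr, y_tr).score(X_te, y_te), 3))

When the target is largely a simple function of well-chosen features, the simpler model keeps up, which is the point of noting that the genetic approach was about as capable.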
---

In some cases, we can use the facilities provided by the services: Google, WordPress, Quora, FB. But, for our own site, we use a Linux-based shared server. We think of it as balancing our reliance on the cloud. Lots to discuss there. 

Related to: TGSoc.org/papers

What we see are six metrics. The world has gone mad numerically, in many ways. So, that, too, will be discussed. But, with respect to the flow of activity, the topic was motivated by OpenAI's little trick last November. They didn't do the world a whole lot of favors; rather, we will see, in less than two years, just how negative the impact might have been. Now, will subsequent activity on their part relieve some of this? 

An adage is apropos: one cannot train out the crap that was trained into a system via machine learning. 

At the "Papers" site, we put out an article in May and then followed in the latter months. 

---

Aside: John retired as a Technical Fellow, having worked in advanced computing systems for most of his career. As such, he dwelt in the space between applications and the underlying technology, principally with regard to data management (early data science) and computational mathematics (in the space of engineering support). When he says "buckets of bits," it's from experience. Knowledge and intelligence? Those were themes in the advanced crowd all during the evolution of computing as we know it now. Or, actually, as we do not know it. Think black boxes and their mysteries? They were created by us. To quote an author who is aware of our work: a demon of our own design. 

---

Punting down the road, we have this blog. Plus, there are two blogs on WordPress. Then, our website is hosted on Web Hosting Hub. We started with Microsoft's small-business web offering (Office Live, if we recall correctly; we need to confirm the specific name) and moved when MS pulled the plug. That decision dropped support for many small businesses who had tried to leverage the capability for their on-line needs. See this search: Configuration. You see, this is paired with "Content," or the absence of it (which is very much the case in lots of web stuff). The timeframe was 2012, which was two years after we started. At our portal (to truth), we detail our research with respect to rolling our own. At the time, we were astonished by the amount of work made available by those with the time for it: permutations without end. Interesting. 

Permutations? Sure, group theory comes to mind; a toy example follows below. It'll be in the background as we proceed. One problem with AIn't? There is no AI; we're talking sophisticated mathematics in action. So, let's raise the level of discussion to where we can get the general populace on board with the future. After all, leaving these things to wizards is problematic. Did we not all learn that over the past two decades? 
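
A toy illustration of what we mean: compose two rearrangements and you get another; every rearrangement has an inverse; and there is an identity. The tuples below are just index shuffles, not tied to any particular application of ours.

    # Permutations of (0, 1, 2) as a group: composition, identity, inverse.
    from itertools import permutations

    def compose(p, q):
        # apply q first, then p
        return tuple(p[q[i]] for i in range(len(q)))

    def inverse(p):
        inv = [0] * len(p)
        for i, v in enumerate(p):
            inv[v] = i
        return tuple(inv)

    identity = tuple(range(3))
    elements = list(permutations(range(3)))
    for p in elements:
        assert compose(p, inverse(p)) == identity    # every element has an inverse
    print(len(elements), "elements in S3")           # prints: 6 elements in S3

Sophisticated mathematics in action, as we said, only at full scale and with far less transparency.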

Remarks: Modified: 11/30/2023

11/30/2023 --
