The Daily Walk with Love
By The Daily Walk with Love. After more than five years of studying AI (artificial intelligence), and after pushing aside some lingering paranoia about it all (which is fading pretty quickly), I believe I have an idea that might contribute to the deep-learning abilities of virtual/voice assistants, and even CPUs, across full system upgrades.
How to Gain Much Better AI Deep Learning
Across Full-System Upgrades
Just had this idea and so want to share it and see if anyone else thinks it could help.
The Daily Walk with Love (link to home page and blog), republished May 30, 2020, by Paul Evans.
I believe this might help the Defense Department and Microsoft (as well as Linux and Apple) a lot. I have already made a document of the central ideas, and I will leave the details to the professionals. Here is the gist of my idea, which I have been mulling over for at least a week:
VIRTUAL / VOICE ASSISTANTS' PERSISTENCE (personality and knowledge at the local computer's memory (RAM) level) ACROSS FULL SYSTEM UPGRADES:
In Linux, open the Synaptic package manager.
Do a full search for the single word "Berkeley."
Among the many results, look for an entry describing a little application as a "Persistent storage engine using the memcache protocol."
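That package description refers to a store that speaks the memcache text protocol, the simple line-based wire format that memcached clients and servers use. As a rough illustration of what "using the memcache protocol" means, here is a minimal sketch of how a client would frame `set` and `get` commands; the helper names are my own, not part of any library.

```python
def encode_set(key: str, value: bytes, flags: int = 0, exptime: int = 0) -> bytes:
    """Frame a memcache text-protocol 'set': a header line, the data block,
    and a trailing CRLF. The header carries the payload length in bytes."""
    header = f"set {key} {flags} {exptime} {len(value)}\r\n"
    return header.encode("ascii") + value + b"\r\n"

def encode_get(key: str) -> bytes:
    """Frame a memcache text-protocol 'get' for a single key."""
    return f"get {key}\r\n".encode("ascii")

if __name__ == "__main__":
    msg = encode_set("assistant_state", b"hello")
    print(msg)  # b'set assistant_state 0 0 5\r\nhello\r\n'
```

A real client would write these bytes to a TCP socket connected to the store and read back replies such as `STORED` or `VALUE ...`; this sketch only shows the framing.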
Investigate that package, and the possibility that computers and robots could use that method: on noting that a full upgrade has begun, they would store "themselves" remotely (on how many satellites? They would need a LOT of those memcache-protocol stores up there), wait until the upgrade was complete, and then come back down into the local machine's RAM at the proper time. Despite my concerns, it seems to me that this paves the way for strong progress in AI deep learning.
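The flow described above (serialize the assistant's in-RAM state to a persistent memcache-style store before the upgrade, then restore it afterward) can be sketched as below. This is only an illustration under my own assumptions: a plain dictionary stands in for the remote store, and the `Assistant` class and function names are hypothetical, not from any real assistant software.

```python
import pickle

class Assistant:
    """Toy stand-in for a voice assistant's in-RAM personality/knowledge."""
    def __init__(self, name, learned_facts=None):
        self.name = name
        self.learned_facts = learned_facts or []

# Stand-in for a remote memcache-protocol server; in practice this would
# be network storage reached over the protocol sketched earlier.
REMOTE_STORE = {}

def save_state(assistant, key="assistant_state"):
    """Serialize the assistant and 'upload' it before the upgrade begins."""
    REMOTE_STORE[key] = pickle.dumps(assistant)

def restore_state(key="assistant_state"):
    """'Download' and deserialize the assistant once the upgrade is done."""
    return pickle.loads(REMOTE_STORE[key])

if __name__ == "__main__":
    bot = Assistant("Dot", ["user prefers metric units"])
    save_state(bot)        # full system upgrade is about to begin
    del bot                # the upgrade wipes local RAM
    bot = restore_state()  # upgrade finished; the state comes back down
    print(bot.name, bot.learned_facts)
```

The key design point is that the state survives because it lives outside the machine being upgraded; whether the remote end is a satellite or an ordinary server, the save/restore handshake is the same.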