Memory-less Predictive Algorithm

Finite memory is one corner of a computational triangle, alongside computational time and output quality, that limits the ability of AI, compression, or any predictive system to produce a series of best guesses or predictions. Every algorithm balances these factors to arrive at an end result.

In general, the theory is that the memory required and the computational time spent together determine the quality of the end result. The more memory you have, the better the quality you can produce, which in turn requires more computational time to achieve. If a lower-quality result is acceptable, you can reduce the memory required, which also reduces the time needed to produce the result.

Producing the highest-quality prediction or guess requires a memory-less algorithm and consumes the greatest amount of computational time. This achieves the best possible outcome with no memory, at the cost of extreme computational expense.
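The post does not disclose the algorithm itself, so the following is only a minimal sketch of the trade-off it describes, using a hypothetical next-symbol predictor of my own construction: one version keeps a persistent frequency table (memory), the other keeps no state at all and rescans the full history on every prediction. Both produce the same guesses; the memory-less version pays for its lack of state with extra computation, quadratic in the data length for this toy.

```python
from collections import Counter

# Toy next-symbol predictor: always guess the most frequent symbol seen so far.
# This is an illustrative assumption, not the (unpublished) algorithm from the post.

def predict_with_memory(table):
    """Stateful version: a running frequency table answers each query cheaply."""
    return table.most_common(1)[0][0] if table else None

def predict_memoryless(history):
    """Memory-less version: no persistent state; rescans the entire
    history on every call, trading computation time for memory."""
    return Counter(history).most_common(1)[0][0] if history else None

data = "abracadabra"
table = Counter()
for i, symbol in enumerate(data):
    fast = predict_with_memory(table)     # cheap: the table already holds the counts
    slow = predict_memoryless(data[:i])   # expensive: O(i) rescan per prediction
    assert fast == slow                   # same guess, very different resource profile
    table[symbol] += 1
```

Over a data set of length n, the memory-less loop performs roughly n²/2 symbol reads where the stateful version performs n, which is the time-for-memory trade the triangle describes.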

Does such an algorithm exist?

Yes, though not in the public domain.

Is it beneficial?

Yes. This algorithm can be utilized on today's processing power, even taking advantage of parallel processing to overcome its main limitation: it is bound by the length of the data set, which exponentially increases the time required to produce an end result. A sketch of that parallelism follows.
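Continuing the hypothetical toy above (again, an assumed illustration rather than the author's actual method): because a memory-less predictor shares no state between positions, every position's prediction can be recomputed independently and farmed out to worker processes.

```python
from collections import Counter
from concurrent.futures import ProcessPoolExecutor

def predict_at(args):
    """Recompute the prediction for one position entirely from scratch.
    No state is shared between positions, so every call is independent."""
    data, i = args
    prefix = data[:i]
    return Counter(prefix).most_common(1)[0][0] if prefix else None

if __name__ == "__main__":
    data = "abracadabra" * 100
    # Because the memory-less predictor carries no state from one position
    # to the next, the per-position rescans can run in separate processes.
    with ProcessPoolExecutor() as pool:
        predictions = list(pool.map(predict_at, [(data, i) for i in range(len(data))]))
    print(predictions[:10])
```

The design point is that statelessness removes the sequential dependency a running table imposes, so the wall-clock cost of the rescans divides across however many workers are available, even though the total work is unchanged.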

Is it plausible today?

With today's computational power, along with a leap into quantum computing, the advantages of this type of algorithm are limitless, as the end result would otherwise be unachievable given the modern ratio of memory to CPU power allocation. Imagine an AI data set an order of magnitude larger than the current data sets of hundreds of gigabytes, requiring no memory to produce the highest-quality result.
