Resolved a bug where deleting the entire chat context left hypaV2Data empty, throwing an undefined index error. Also converted the index to TypeScript.
Hopefully resolved the issue where the same chat is summarized over and over again by adding another field to mainChunks.
# Changelist:
## 1. Types
### MainChunks
Added id (int) and chatMemos (Set<string>).
id: incremental int starting from 0.
chatMemos: a set of UUIDs recording which chats have been summarized into this mainChunk.
### Chunks
mainChunkID: links the chunk to the mainChunk it was split from.
text: the split text data.
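A minimal sketch of these shapes in TypeScript; only id, chatMemos, mainChunkID, and text come from this changelist, and the MainChunk text field is inferred from the summarized-texts description below rather than confirmed:

```ts
// Sketch of the updated type shapes described above (not the exact source types).
interface MainChunk {
  id: number;              // incremental int starting from 0
  text: string;            // summarized text (assumed field, inferred from context)
  chatMemos: Set<string>;  // UUIDs of the chats summarized into this mainChunk
}

interface Chunk {
  mainChunkID: number;     // id of the mainChunk this chunk was split from
  text: string;            // the split text data
}
```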
## 2. Features
### CleanInvalidChunks
Called every time the chat is updated and HypaMemory is used.
Gets all the memos (UUIDs) of the current chats and builds a set from them.
Then checks whether each mainChunk's chatMemos set is a subset of that memo set. If it is not, a chat covered by that summary has been deleted or edited, so the mainChunk is filtered out.
At the same time, the chunks that were split from that mainChunk are also deleted.
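A rough sketch of that cleanup pass, assuming the type shapes above; the function signature and the way state is passed in are assumptions, not the actual implementation:

```ts
// Hypothetical sketch of cleanInvalidChunks as described above.
function cleanInvalidChunks(
  chats: { memo: string }[],
  data: { mainChunks: MainChunk[]; chunks: Chunk[] }
): void {
  // Collect every memo (UUID) present in the current chat list.
  const currentMemos = new Set(chats.map((chat) => chat.memo));

  // Keep a mainChunk only if all of its summarized chats still exist,
  // i.e. its chatMemos set is a subset of currentMemos.
  data.mainChunks = data.mainChunks.filter((mainChunk) =>
    Array.from(mainChunk.chatMemos).every((memo) => currentMemos.has(memo))
  );

  // Drop the chunks whose parent mainChunk was filtered out.
  const validIds = new Set(data.mainChunks.map((mainChunk) => mainChunk.id));
  data.chunks = data.chunks.filter((chunk) => validIds.has(chunk.mainChunkID));
}
```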
Also reverted a potentially problematic feature (adding hypaV2Data chunks).
TODO:
1. Mid-context editing is currently not treated as a deletion. Add an optional editedChatIndex later to dig into this.
2. Re-roll mainChunks (re-summarization) functionality has been added, but there is no way to access it yet.
Everything works as intended.
When the first chat limit is reached, it successfully summarizes the previous chats according to the chunk size, then returns the chat with the summarized text as a system prompt, with the other chats after the summarized point correctly appended.
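As an illustration of that flow, a minimal sketch of how the returned prompt could be assembled; the names and message structure are assumptions, not the actual implementation:

```ts
// Illustrative only: prepend the summary as a system prompt, then append
// the chats that come after the summarized point.
function buildPrompt(
  chats: { role: string; content: string; memo: string }[],
  summary: string,
  summarizedUpTo: number // index of the last chat covered by the summary (assumed)
) {
  return [
    { role: 'system', content: summary },
    ...chats.slice(summarizedUpTo + 1),
  ];
}
```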
Automatic sub-chunking also works perfectly.
Tested with the sub-model.
Back to the original logic; only model selection is implemented.
Previous issue: the mainChunks object holds the list of summarized texts, but for some reason they were deleted after use and then re-added, causing the same chats to be summarized again.
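A sketch of how the new chatMemos field can guard against this, assuming the types above; the function name and chat shape are hypothetical:

```ts
// Hypothetical guard: only summarize chats whose memo is not already recorded
// in an existing mainChunk, so the same chat is never summarized twice.
function selectUnsummarizedChats(
  chats: { memo: string; content: string }[],
  mainChunks: MainChunk[]
): { memo: string; content: string }[] {
  const summarizedMemos = new Set<string>();
  for (const mainChunk of mainChunks) {
    for (const memo of mainChunk.chatMemos) {
      summarizedMemos.add(memo);
    }
  }
  return chats.filter((chat) => !summarizedMemos.has(chat.memo));
}
```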