Commit Graph

1534 Commits

Author SHA1 Message Date
kwaroran
70daaff160 Add llamacpp python scripts 2024-01-12 12:11:58 +09:00
kwaroran
b4b461288d Add Python installation and setup functionality 2024-01-12 12:02:06 +09:00
kwaroran
11ae0266d9 Update version to 1.73.2 2024-01-11 21:50:50 +09:00
kwaroran
18f85a321f Refactor BotSettings.svelte layout 2024-01-11 21:50:22 +09:00
kwaroran
a319eedee1 Add experimental support for Claude AWS 2024-01-11 21:50:04 +09:00
kwaroran
6031ecea32 Fix OpenAI Fixer plugin 2024-01-11 21:41:55 +09:00
kwaroran
1f2ee464d1 Remove unused function isAPNG from parser.ts 2024-01-10 07:09:27 +09:00
kwaroran
264211fdfd Refactor image.ts to use checkImageType instead of isAPNG 2024-01-10 07:08:13 +09:00
kwaroran
fd0de264e9 Update version to 1.73.1 2024-01-10 06:56:00 +09:00
kwaroran
f66d8f66a8 Change FontColorItalic color in highcontrast mode 2024-01-10 06:54:40 +09:00
kwaroran
6b88602673 Fix conditional statement in ProomptItem.svelte 2024-01-10 06:53:38 +09:00
kwaroran
a2f81fefd8 Fix google api key display 2024-01-09 20:03:55 +09:00
kwaroran
3610c2c9c0 Update version to 1.73.0 2024-01-09 00:31:20 +09:00
kwaroran
350bc66851 Add support for WebLLM 2024-01-09 00:27:58 +09:00
kwaroran
1ee7107153 Fix conversion units in metricaPlugin 2024-01-08 16:01:21 +09:00
kwaroran
7db7b71202 Move openrouter settings to BotSettings.svelte 2024-01-08 09:25:37 +09:00
kwaroran
5d7bc64ed1 Fix price calculation in openRouterModels function 2024-01-08 09:23:26 +09:00
kwaroran
414fad13ed Update price calculation in openRouterModels function 2024-01-08 09:14:28 +09:00
kwaroran
f8aba003fb Update version to 1.72.0 2024-01-08 09:13:19 +09:00
kwaroran
a490dbde47 Add new language keys and checkboxes for openrouter fallback and middle out 2024-01-08 09:10:52 +09:00
kwaroran
62a6577368 Add openrouterFallback and openrouterMiddleOut options 2024-01-08 09:09:12 +09:00
kwaroran
aaa5843c51 Add custom chain of thought feature and max thought tag depth setting 2024-01-08 09:03:12 +09:00
kwaroran
ecbbc8e85c Update fileURL in runVITS to Use localModelPath 2024-01-06 23:21:38 +09:00
kwaroran
229d974d75 Rename tfCache 2024-01-06 23:17:33 +09:00
kwaroran
4be7a6762c Update local model path to use remote URL 2024-01-06 23:16:13 +09:00
kwaroran
3a8b09586f fix URL matching logic in initTransformers function 2024-01-06 23:12:49 +09:00
kwaroran
8617257298 Update version 1.71.4 2024-01-06 23:08:15 +09:00
kwaroran
f638866308 Fix error handling 2024-01-06 22:43:59 +09:00
kwaroran
8eccda086d Refactor URL handling in initTransformers function 2024-01-06 22:41:08 +09:00
kwaroran
de17fe3afa Update version to 1.71.3 and add file URL mapping 2024-01-06 22:31:42 +09:00
kwaroran
e974b5ee7f Match URLs to cache server 2024-01-06 22:21:45 +09:00
kwaroran
c3d7210cc3 Refactor caching logic in sw.js and transformers.ts 2024-01-06 21:18:36 +09:00
kwaroran
afd085bd93 Fix tfCache initialization and assignment 2024-01-06 20:50:49 +09:00
kwaroran
46ee706399 Add initialization function for Transformers and update model paths 2024-01-06 20:49:34 +09:00
kwaroran
85cd1cbc65 Add 404 response for 'tf' resource in sw.js 2024-01-06 20:32:19 +09:00
kwaroran
da77f12f7e Update version to 1.71.2 2024-01-06 20:25:31 +09:00
kwaroran
a9872b3d5d Update localModelPath in transformers.ts 2024-01-06 20:25:03 +09:00
kwaroran
1114d5b61b Update version to 1.71.1 2024-01-06 19:46:19 +09:00
kwaroran
229aeced06 Merge branch 'main' of https://github.com/kwaroran/RisuAI 2024-01-06 19:45:57 +09:00
kwaroran
5a305a8f10 Fix sw register temp 2024-01-06 19:45:50 +09:00
kwaroran
cba3ff802c Add selectable tokenizer supports on Ooba (#281)
# PR Checklist
- [ ] Did you check if it works normally in all models? *ignore this
when it doesn't use models*
- [ ] Did you check if it works normally in all of the web, local and node
hosted versions? If it doesn't, did you block it in those versions?
- [ ] Did you add a type def?

# Description
I made simple changes to the code that allow the user to choose a tokenizer.

As I wrote in https://github.com/kwaroran/RisuAI/issues/280, differences
between tokenizers cause errors when using Mistral-based models.


![image](https://github.com/kwaroran/RisuAI/assets/62899533/3eb07735-874f-46d0-bc0c-c92a32ef927b)
As I'm not good at JavaScript, I implemented this simply by having the user
write the name of the tokenizer model and selecting the matching one in the tokenizer.ts file.

I tested it on my node-hosted RisuAI and sent a long context to my own server.

![image](https://github.com/kwaroran/RisuAI/assets/62899533/5b1f22a0-5b1b-4472-a994-bfe5472ba159)
As a result, Ooba returned 15858 prompt tokens.


![image](https://github.com/kwaroran/RisuAI/assets/62899533/6d4c2185-07c9-4de1-8460-0983b6e45141)
And when testing with the official tokenizer implementations, there is about a
1k-token difference between the Llama tokenizer and the Mistral tokenizer.

So I think adding this option will help users use oobabooga with fewer
errors (a brief sketch of the idea follows this entry).
2024-01-06 19:16:58 +09:00
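
The change described in #281 comes down to mapping a user-supplied tokenizer name to a concrete tokenizer when counting prompt tokens. Below is a minimal sketch of that idea, assuming the `AutoTokenizer` API from `@xenova/transformers`; the function name, setting names, and model IDs are placeholders, not the identifiers actually used in tokenizer.ts.

```ts
// Minimal sketch (not RisuAI's actual code): map a user-chosen tokenizer name
// to a Hugging Face tokenizer and count prompt tokens with it.
import { AutoTokenizer } from '@xenova/transformers';

// Hypothetical name -> model id mapping; the model IDs are placeholders.
const TOKENIZER_IDS: Record<string, string> = {
  llama: 'Xenova/llama-tokenizer',
  mistral: 'mistralai/Mistral-7B-v0.1',
};

// Count tokens using whichever tokenizer the user selected in settings,
// falling back to the Llama tokenizer for unknown names.
export async function countPromptTokens(text: string, tokenizerName: string): Promise<number> {
  const modelId = TOKENIZER_IDS[tokenizerName] ?? TOKENIZER_IDS['llama'];
  const tokenizer = await AutoTokenizer.from_pretrained(modelId);
  return tokenizer.encode(text).length;
}
```

Counting with the same tokenizer the backend uses keeps the client-side estimate aligned with what Ooba reports, which is the roughly 1k-token gap the PR measured between the Llama and Mistral tokenizers.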
kwaroran
8ba0d10d56 Update version to 1.71.0 2024-01-06 19:13:57 +09:00
kwaroran
70f25968b5 add transformers remoteHost again 2024-01-06 19:12:18 +09:00
kwaroran
497e0c6dfb Update service worker and remote host URLs 2024-01-06 19:10:21 +09:00
kwaroran
f4d6fcb38c Update characterCards.ts with VITS export support 2024-01-06 07:05:20 +09:00
kwaroran
7344e566f4 Add model selection for VitsModel 2024-01-06 06:45:18 +09:00
kwaroran
66c6511684 Refactor sw.js to support custom headers in registerCache function 2024-01-06 01:28:00 +09:00
kwaroran
0ca4ec3695 Add VITS support 2024-01-06 00:28:06 +09:00
kwaroran
cfdd60bead Remove TransformersBodyType 2024-01-05 23:58:23 +09:00
justpain02
ff4c67b993 Add selectable tokenizer supports on Ooba 2024-01-05 23:57:59 +09:00