perf3ct | 14705eb1c5 | split up sendMessage into its own service | 2025-04-02 19:14:26 +00:00
perf3ct | caada309ec | try using XML tags in sending to LLM, so it can more easily pick out information | 2025-04-02 18:57:04 +00:00
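The commit above describes wrapping note content in XML-style tags before sending it to the model so the LLM can distinguish injected context from the user's question. A minimal sketch of that kind of formatting, assuming a `NoteContext` shape and tag names that are illustrative rather than the project's actual interface:

```typescript
// Hypothetical shape of a note passed as context; not the project's real interface.
interface NoteContext {
  title: string;
  content: string;
}

/**
 * Wrap each note in explicit XML-style tags so the LLM can reliably
 * separate the injected context from the user's question.
 */
function buildXmlContext(notes: NoteContext[], question: string): string {
  const noteBlocks = notes
    .map(
      (note) =>
        `<note>\n  <title>${note.title}</title>\n  <content>${note.content}</content>\n</note>`
    )
    .join("\n");

  return `<context>\n${noteBlocks}\n</context>\n<question>${question}</question>`;
}

// Example usage
console.log(
  buildXmlContext(
    [{ title: "Meeting notes", content: "Discussed Q2 roadmap." }],
    "What was discussed in the meeting?"
  )
);
```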
perf3ct | 6e8ab373d8 | use highlight.js in code_handlers where possible | 2025-04-02 17:38:28 +00:00
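For the highlight.js change, a small sketch of how a code handler might call the library: highlight explicitly when the language is known and registered, otherwise fall back to auto-detection. The function name is illustrative; the actual code_handlers module may be structured differently.

```typescript
import hljs from "highlight.js";

/**
 * Highlight a code snippet with highlight.js, using the declared language
 * when it is registered and auto-detection otherwise.
 */
function highlightSnippet(code: string, language?: string): string {
  if (language && hljs.getLanguage(language)) {
    return hljs.highlight(code, { language }).value;
  }
  return hljs.highlightAuto(code).value;
}

console.log(highlightSnippet("const x: number = 1;", "typescript"));
```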
perf3ct | b7d5d926f7 | centralize all formatter prompt strings | 2025-04-02 17:29:53 +00:00
perf3ct | fde644a432 | remove commented imports | 2025-04-02 17:26:32 +00:00
perf3ct | c500300267 | this can be much faster | 2025-04-01 21:44:54 +00:00
perf3ct | ed52d71729 | do a better job at centralizing json extraction, and query "enhancer" search queries | 2025-04-01 21:42:09 +00:00
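Centralized JSON extraction from LLM output typically means one shared helper that tolerates prose and fenced blocks around the JSON. A simplified sketch under that assumption; the project's shared extractor is likely more defensive:

```typescript
/**
 * Extract a JSON object from an LLM response that may wrap it in prose
 * or a fenced code block.
 */
function extractJson<T>(response: string): T | null {
  // Prefer the contents of an explicit fenced block if the model produced one.
  const fenced = response.match(/`{3}(?:json)?\s*([\s\S]*?)`{3}/);
  const candidate = fenced ? fenced[1] : response;

  // Fall back to the first {...} span in the text.
  const start = candidate.indexOf("{");
  const end = candidate.lastIndexOf("}");
  if (start === -1 || end === -1 || end <= start) {
    return null;
  }

  try {
    return JSON.parse(candidate.slice(start, end + 1)) as T;
  } catch {
    return null;
  }
}
```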
perf3ct | 5b3dca88d9 | fix interface issues | 2025-04-01 20:55:58 +00:00
perf3ct | 49076e3cf6 | clean up unused options | 2025-04-01 20:38:03 +00:00
perf3ct | afe1de5ed3 | get rid of silly ollamaIsEnabled | 2025-04-01 19:41:30 +00:00
perf3ct | 9719859a39 | centralize constants for message formatting | 2025-04-01 19:33:53 +00:00
perf3ct | 154d2905fa | actually undo translations in hierarchy.ts for now | 2025-04-01 18:51:37 +00:00
perf3ct | 2db0ff2462 | move prompt constants from JS to TS | 2025-04-01 18:49:37 +00:00
perf3ct | afd16c22b7 | make all hierarchy.ts strings translateable, and centralize them | 2025-04-01 18:48:39 +00:00
perf3ct | f2cb013e14 | dynamically adjust context window sizes based on conversation context | 2025-03-30 22:13:40 +00:00
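Dynamically sizing the context window from the conversation could look roughly like the sketch below. The constants and the characters-per-token heuristic are assumptions, not values taken from the actual implementation:

```typescript
interface ChatMessage {
  role: "system" | "user" | "assistant";
  content: string;
}

/**
 * Estimate a context window size from the conversation so far,
 * clamped between a minimum and maximum token budget.
 */
function estimateContextWindow(messages: ChatMessage[]): number {
  const MIN_CONTEXT_TOKENS = 2048;   // assumed floor
  const MAX_CONTEXT_TOKENS = 16384;  // assumed ceiling
  const CHARS_PER_TOKEN = 4;         // rough heuristic

  const conversationChars = messages.reduce((sum, m) => sum + m.content.length, 0);
  const estimatedTokens = Math.ceil(conversationChars / CHARS_PER_TOKEN);

  // Leave headroom for the reply by doubling the estimate, then clamp.
  return Math.min(MAX_CONTEXT_TOKENS, Math.max(MIN_CONTEXT_TOKENS, estimatedTokens * 2));
}
```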
perf3ct | 29845c343c | move translation strings for AI Chat Note type | 2025-03-30 21:28:34 +00:00
perf3ct | c046343349 | fix voyage.ts typescript issues | 2025-03-30 21:03:27 +00:00
perf3ct | 614d5ccdd3 | move from using axios to fetch in llm services | 2025-03-30 21:00:02 +00:00
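Replacing axios with the built-in fetch API generally means adding an explicit HTTP status check, since fetch does not reject on non-2xx responses the way axios does. An illustrative sketch; the endpoint, payload shape, and response type are placeholders, not the project's real provider contract:

```typescript
/**
 * Fetch-based POST to a chat-completion style endpoint, replacing an
 * equivalent axios call.
 */
async function postChatCompletion(
  baseUrl: string,
  apiKey: string,
  body: unknown
): Promise<unknown> {
  const response = await fetch(`${baseUrl}/v1/chat/completions`, {
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      Authorization: `Bearer ${apiKey}`,
    },
    body: JSON.stringify(body),
  });

  // fetch resolves on HTTP errors, so check the status explicitly.
  if (!response.ok) {
    throw new Error(`LLM request failed: ${response.status} ${response.statusText}`);
  }

  return response.json();
}
```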
perf3ct | dd9b37e9fb | move query decomp strings to their own file | 2025-03-30 20:08:27 +00:00
perf3ct | 997edd8de8 | clean up anthropic constant locations | 2025-03-30 19:50:16 +00:00
perf3ct | a5488771ae | fix showing percentage of embeddings that are completed | 2025-03-30 19:43:10 +00:00
perf3ct | 40bbdb2faa | fix chunking imports again | 2025-03-30 19:41:31 +00:00
perf3ct | 6b86bf93ae | fix import paths in chunking | 2025-03-30 19:35:07 +00:00
perf3ct | def28b1dcd | migrate to a pipeline approach for LLM chats | 2025-03-29 21:31:33 +00:00
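A "pipeline approach" for LLM chats usually means composing the chat flow out of ordered stages that each transform a shared context object. The stage names and context shape below are assumptions about what that might mean here, not the actual design:

```typescript
interface ChatPipelineContext {
  query: string;
  contextNotes: string[];
  messages: { role: string; content: string }[];
  response?: string;
}

type PipelineStage = (ctx: ChatPipelineContext) => Promise<ChatPipelineContext>;

/** Run each stage in order, threading the context through. */
async function runPipeline(
  stages: PipelineStage[],
  initial: ChatPipelineContext
): Promise<ChatPipelineContext> {
  let ctx = initial;
  for (const stage of stages) {
    ctx = await stage(ctx);
  }
  return ctx;
}

// Example stages: retrieve context notes, then build the message list.
const retrieveContext: PipelineStage = async (ctx) => ({
  ...ctx,
  contextNotes: ["(notes matching the query would be fetched here)"],
});

const buildMessages: PipelineStage = async (ctx) => ({
  ...ctx,
  messages: [
    { role: "system", content: ctx.contextNotes.join("\n") },
    { role: "user", content: ctx.query },
  ],
});

// Usage: runPipeline([retrieveContext, buildMessages], { query: "…", contextNotes: [], messages: [] })
```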
perf3ct | 8497e77b55 | fix linter errors in providers | 2025-03-28 23:27:40 +00:00
perf3ct | 2311c3c049 | centralize LLM constants more | 2025-03-28 23:25:06 +00:00
perf3ct | 224cb22fe9 | centralize prompts | 2025-03-28 23:07:02 +00:00
perf3ct | 72c380b6f4 | do a wayyy better job at building the messages with context | 2025-03-28 22:50:15 +00:00
perf3ct | ea4d3ac800 | Do a better job with Ollama context, again | 2025-03-28 22:29:33 +00:00
perf3ct | 2899707e64 | Better use of interfaces, reducing useage of "any" | 2025-03-28 21:47:28 +00:00
perf3ct | 005ddc4a59 | create more interfaces to decrease use of "any" | 2025-03-28 21:04:12 +00:00
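The two commits about reducing "any" amount to giving provider data explicit interfaces so the compiler can catch shape mismatches. A small sketch of that pattern; the field names here are assumptions, not the project's real response type:

```typescript
// Before: a loosely typed provider response.
// function handleResponse(response: any) { return response.content.trim(); }

/** Illustrative provider response interface replacing `any`. */
interface ProviderChatResponse {
  model: string;
  content: string;
  usage?: {
    promptTokens: number;
    completionTokens: number;
  };
}

function handleResponse(response: ProviderChatResponse): string {
  return response.content.trim();
}
```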
perf3ct | 44cd2ebda6 | fix requeue errors | 2025-03-28 20:37:09 +00:00
perf3ct | 5456ac32ef | set up embedding similarity constants and similarity system | 2025-03-26 23:12:45 +00:00
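An embedding similarity system of this kind is typically built around cosine similarity plus a relevance threshold constant. A minimal sketch; the threshold value is an assumption, not the one used in the project:

```typescript
/** Assumed relevance cutoff for deciding whether a note is similar enough. */
const SIMILARITY_THRESHOLD = 0.65;

/** Cosine similarity between two embedding vectors of equal dimension. */
function cosineSimilarity(a: number[], b: number[]): number {
  if (a.length !== b.length) {
    throw new Error("Embedding vectors must have the same dimension");
  }
  let dot = 0;
  let normA = 0;
  let normB = 0;
  for (let i = 0; i < a.length; i++) {
    dot += a[i] * b[i];
    normA += a[i] * a[i];
    normB += b[i] * b[i];
  }
  if (normA === 0 || normB === 0) {
    return 0;
  }
  return dot / (Math.sqrt(normA) * Math.sqrt(normB));
}

console.log(cosineSimilarity([0.1, 0.2, 0.3], [0.1, 0.25, 0.28]) >= SIMILARITY_THRESHOLD);
```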
perf3ct | a7cafceac9 | more heavily weigh notes with title matches when giving context to LLM | 2025-03-26 23:05:16 +00:00
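Weighing title matches more heavily usually means adding a fixed boost to a note's similarity score when the query appears in its title. A sketch under that assumption; the boost factor and scoring model are illustrative:

```typescript
interface ScoredNote {
  title: string;
  similarity: number; // embedding similarity in [0, 1]
}

/** Assumed additive boost for notes whose titles match the query. */
const TITLE_MATCH_BOOST = 0.2;

function scoreNote(note: ScoredNote, query: string): number {
  const titleMatches = note.title.toLowerCase().includes(query.toLowerCase());
  return note.similarity + (titleMatches ? TITLE_MATCH_BOOST : 0);
}

console.log(scoreNote({ title: "Project roadmap", similarity: 0.6 }, "roadmap"));
```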
perf3ct | 7c519df9b5 | fix prompt path import | 2025-03-26 19:12:05 +00:00
perf3ct | 713805394c | move providers.ts into providers folder | 2025-03-26 19:10:16 +00:00
perf3ct | 5869eaff9a | move more constants from files into centralized location | 2025-03-26 18:08:30 +00:00
perf3ct | a50575c12c | move more prompts to the constants file | 2025-03-26 18:01:20 +00:00
perf3ct | c49883fdfa | move constants to their own files and folder | 2025-03-26 17:56:37 +00:00
perf3ct | 44b6734034 | anthropic works | 2025-03-26 04:13:04 +00:00
perf3ct | 9d29ff4a6c | don't spam the logs if a provider isn't enabled | 2025-03-24 21:13:54 +00:00
perf3ct | 150b0f0977 | remove isEnabled from embedding providers | 2025-03-24 20:35:46 +00:00
perf3ct | 0707266dc1 | reset embedding_queue where objects are "isprocessing" | 2025-03-20 22:17:04 +00:00
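Resetting "isprocessing" items in the embedding queue suggests clearing stale flags (for example after a crash) so stuck items get picked up again. A sketch under that assumption; the table and column names follow the commit message, and the `db.run` helper is hypothetical:

```typescript
/** Minimal database abstraction assumed for illustration. */
interface Database {
  run(sql: string, params?: unknown[]): Promise<void>;
}

/**
 * Requeue embedding items left flagged as processing, e.g. on startup
 * after an interrupted run.
 */
async function resetStuckEmbeddingQueueItems(db: Database): Promise<void> {
  await db.run(
    "UPDATE embedding_queue SET isProcessing = 0 WHERE isProcessing = 1"
  );
}
```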
perf3ct | c9728e70bb | also extract Note relationships and send as context | 2025-03-20 19:50:48 +00:00
perf3ct | 915c95f7cb | more aggressively filter notes out that don't work for us | 2025-03-20 19:42:38 +00:00
perf3ct | 1be70f1163 | do a better job of building the context | 2025-03-20 19:35:20 +00:00
perf3ct | 9c1ab4f322 | add to base prompt | 2025-03-20 19:22:41 +00:00
perf3ct | 273dff2a34 | create a better base system prompt | 2025-03-20 19:11:32 +00:00
perf3ct | eb1ef36ab3 | move the llm_prompt_constants to its own folder | 2025-03-20 18:49:30 +00:00
perf3ct | e566692361 | centralize all prompts | 2025-03-20 00:06:56 +00:00