b00c20c357 | perf3ct | 2025-03-24 21:16:20 +00:00 | Merge branch 'develop' into ai-llm-integration
cf874b5ee8 | Elian Doran | 2025-03-22 10:27:42 +02:00 | feat(mermaid): add basic support for vertical layout
28c51cb38a | Elian Doran | 2025-03-22 02:15:09 +02:00 | refactor(client): use webpack for mermaid
34940b5258 | perf3ct | 2025-03-20 19:52:01 +00:00 | Merge branch 'develop' into ai-llm-integration
b44bb4053c | Elian Doran | 2025-03-20 21:51:03 +02:00 | refactor(deps): use webpack for jsplumb & panzoom
1be70f1163 | perf3ct | 2025-03-20 19:35:20 +00:00 | do a better job of building the context
eb1ef36ab3 | perf3ct | 2025-03-20 18:49:30 +00:00 | move the llm_prompt_constants to its own folder
e566692361 | perf3ct | 2025-03-20 00:06:56 +00:00 | centralize all prompts
db4dd6d2ef | perf3ct | 2025-03-19 19:28:02 +00:00 | refactor "context" services
feb9fa03c3 | Panagiotis Papadopoulos | 2025-03-19 20:03:24 +01:00 | chore(deps): move mind-elixir related packages to devDependencies (webpack takes care of bundling these, so we don't need the modules at runtime anymore)
352204bf78 | perf3ct | 2025-03-19 18:49:14 +00:00 | add agentic thinking to chat
08f7f1962b | perf3ct | 2025-03-18 00:50:55 +00:00 | do a better job with similarity searches
c37201183b | perf3ct | 2025-03-17 22:32:00 +00:00 | add Voyage AI as Embedding provider
84a8473beb | perf3ct | 2025-03-17 21:47:11 +00:00 | adapt or regenerate embeddings - allows users to decide
ebc5107b96 | perf3ct | 2025-03-17 21:23:43 +00:00 | add missing options
37f1dcdaab | perf3ct | 2025-03-17 21:03:21 +00:00 | add ability to fetch available models from openai
c40c702761 | perf3ct | 2025-03-17 20:17:28 +00:00 | add anthropic options as well
4a4eac6f25 | perf3ct | 2025-03-17 20:07:53 +00:00 | Allow users to specify OpenAI embedding and chat models
d95fd0b049 | perf3ct | 2025-03-17 19:54:11 +00:00 | allow specifying openai embedding models too
cc85b9a8f6 | perf3ct | 2025-03-16 20:55:55 +00:00 | fix autoupdate name inconsistency
781a2506f0 | perf3ct | 2025-03-16 18:55:53 +00:00 | fix embeddings w/ cls.init()
697d348286 | perf3ct | 2025-03-16 18:08:50 +00:00 | set up more reasonable context window and dimension sizes
c556989f85 | perf3ct | 2025-03-15 19:38:27 +00:00 | Merge branch 'develop' into ai-llm-integration
0f28bbb1be | Elian Doran | 2025-03-15 10:11:54 +02:00 | feat(server): use custom temporary directory within trilium-data
39d265a9fa | Jon Fuller | 2025-03-12 11:58:30 -07:00 | Merge branch 'develop' into ai-llm-integration
eaa947ef7c | perf3ct | 2025-03-12 00:08:39 +00:00 | "rebuild index" functionality for users
72b1426d94 | perf3ct | 2025-03-12 00:02:02 +00:00 | break up large vector_store into smaller files
fc5599575c | perf3ct | 2025-03-11 23:29:54 +00:00 | allow users to manually request index to be rebuilt
730d123802 | perf3ct | 2025-03-11 23:26:47 +00:00 | create llm index service
0d2858c7e9 | perf3ct | 2025-03-11 23:04:51 +00:00 | upgrade chunking
6ce3f1c355 | perf3ct | 2025-03-11 22:47:36 +00:00 | better note names to LLM?
cf76358dd7 | Elian Doran | 2025-03-11 23:03:34 +02:00 | fix(canvas): font loading
f47b070f0f | perf3ct | 2025-03-11 20:22:01 +00:00 | I think this works to handle failed embeddings
56fc720ac7 | perf3ct | 2025-03-11 17:31:26 +00:00 | undo accidental MAX_ALLOWED_FILE_SIZE_MB change
ff679b00b6 | perf3ct | 2025-03-11 17:30:50 +00:00 | move providers to their own folder
d2dc401639 | perf3ct | 2025-03-11 03:58:39 +00:00 | add these options as configurable
d713f3831a | Jon Fuller | 2025-03-10 16:43:48 -07:00 | Merge branch 'develop' into ai-llm-integration
c1585c73da | perf3ct | 2025-03-10 05:06:33 +00:00 | actually shows useful responses now
ef6ecdc42d | perf3ct | 2025-03-10 04:28:56 +00:00 | it errors, but works
cf0e9242a0 | perf3ct | 2025-03-10 03:34:48 +00:00 | try a context approach
e129e0369d | Elian Doran | 2025-03-09 22:23:01 +02:00 | server(attachments): render empty SVGs properly (closes #1378)
adaac46fbf | perf3ct | 2025-03-09 02:19:26 +00:00 | I'm 100% going to have to destroy this commit later
0cd1be5568 | perf3ct | 2025-03-08 23:17:13 +00:00 | Show embedding generation stats to user
0daa9e717f | perf3ct | 2025-03-08 23:13:49 +00:00 | I can create embeddings now?
6ace4d5692 | perf3ct | 2025-03-08 23:08:25 +00:00 | nearly able to process embeddings
dc439b21b0 | perf3ct | 2025-03-08 23:01:45 +00:00 | update schema with our new tables
553f7dd498 | perf3ct | 2025-03-08 22:28:14 +00:00 | fix the Ollama embedding model setting option breaking
d3013c925e | perf3ct | 2025-03-08 22:23:50 +00:00 | add additional options for ollama embeddings
1361e4d438 | perf3ct | 2025-03-08 22:04:10 +00:00 | set up embedding API endpoints
9f84a84f96 | perf3ct | 2025-03-08 20:51:57 +00:00 | Merge branch 'develop' into ai-llm-integration