From 8214079093964b9aa21fb4ed9c4180963d13253e Mon Sep 17 00:00:00 2001
From: Kaloyan Nikolov
Date: Thu, 26 Feb 2026 01:36:07 +0100
Subject: [PATCH] docs: update IDEA.md to mention TypeScript and pnpm

---
 IDEA.md | 4 ++--
 1 file changed, 2 insertions(+), 2 deletions(-)

diff --git a/IDEA.md b/IDEA.md
index 990fcc6..3d43b57 100644
--- a/IDEA.md
+++ b/IDEA.md
@@ -1,3 +1,3 @@
-Lightweigh app that i will run on android. Built using react (not react-native), capacitor for the wrapper and ill run android studio myself in the end for the actual apk generation. It is a chat app to chat with locally (or remotely) hosted llms. When the user enters, if there is no memory of the last connection, it asks for a api endpoint (openai-like). the user provides it and they have a smal top left button to open a drawer with all sessions (also buttons to add/remove sessions). The actual chat window will be very minimal. The theme will be dark/black. there should be a "save conversation" button that exports it as some form of file (.log?.txt?.md?). the communication with the llm api should be something simple, if it was a bash program, i'd literally just wrap user prompts in a curl template and extract the "content" from the output, for a react app you figure the get/post. oh, also an option to add custom system prompts / inject my own prompts.
+Lightweigh app that i will run on android. Built using react (not react-native) with TypeScript, capacitor for the wrapper and ill run android studio myself in the end for the actual apk generation. It is a chat app to chat with locally (or remotely) hosted llms. When the user enters, if there is no memory of the last connection, it asks for a api endpoint (openai-like). the user provides it and they have a smal top left button to open a drawer with all sessions (also buttons to add/remove sessions). The actual chat window will be very minimal. The theme will be dark/black. there should be a "save conversation" button that exports it as some form of file (.log?.txt?.md?). the communication with the llm api should be something simple, if it was a bash program, i'd literally just wrap user prompts in a curl template and extract the "content" from the output, for a react app you figure the get/post. oh, also an option to add custom system prompts / inject my own prompts.
 
-codebase rules: it needs to be simple. it must be modular. when making functions, think if there isn't a similar one that you can abstract for both usecases. it needs unit tests.
+codebase rules: it needs to be simple. it must be modular. when making functions, think if there isn't a similar one that you can abstract for both usecases. it needs unit tests. uses pnpm for package management.
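The "wrap the prompt, extract the content" flow the idea note describes could be sketched in TypeScript against an OpenAI-compatible chat-completions endpoint. This is only an illustration of the approach, not code from the repo: the function names (`buildChatRequest`, `extractContent`, `sendChat`), the `/v1/chat/completions` path, and the request shape are assumptions based on the note's "openai-like" endpoint.

```typescript
// Minimal OpenAI-compatible chat types -- only the fields this sketch uses.
interface ChatMessage {
  role: "system" | "user" | "assistant";
  content: string;
}

interface ChatCompletionResponse {
  choices: { message: { role: string; content: string } }[];
}

// Build the POST payload, prepending the optional custom system prompt
// the note asks for ("inject my own prompts").
function buildChatRequest(
  model: string,
  history: ChatMessage[],
  systemPrompt?: string
): { model: string; messages: ChatMessage[] } {
  const messages: ChatMessage[] = systemPrompt
    ? [{ role: "system", content: systemPrompt }, ...history]
    : history;
  return { model, messages };
}

// The "extract the 'content' from the output" step: pull the assistant
// text out of the response, falling back to an empty string.
function extractContent(res: ChatCompletionResponse): string {
  return res.choices[0]?.message.content ?? "";
}

// The network call itself is a plain fetch POST to the user-supplied endpoint.
async function sendChat(
  endpoint: string,
  body: ReturnType<typeof buildChatRequest>
): Promise<string> {
  const r = await fetch(`${endpoint}/v1/chat/completions`, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify(body),
  });
  return extractContent(await r.json());
}
```

Keeping `buildChatRequest` and `extractContent` as pure functions separate from the `fetch` call fits the note's "modular, unit-testable" rule: the request shaping and response parsing can be unit-tested without any network.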
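For the "save conversation" export button, one candidate among the formats the note leaves open (.log / .txt / .md) is markdown. A pure serializer like the sketch below could feed either a browser download or the Capacitor Filesystem plugin; the name `conversationToMarkdown` and the exact layout are illustrative assumptions, not a decided format.

```typescript
interface SessionMessage {
  role: string;
  content: string;
}

// Render one session as a markdown document: session title as the
// heading, then each message labeled with its role.
function conversationToMarkdown(
  title: string,
  msgs: SessionMessage[]
): string {
  const body = msgs
    .map((m) => `**${m.role}:**\n\n${m.content}\n`)
    .join("\n");
  return `# ${title}\n\n${body}`;
}
```

Because it is a pure string-in/string-out function, it is trivially unit-testable, and the same output could be written to `.md`, `.txt`, or `.log` by changing only the file extension at save time.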