Coding with codeqwen is interesting: sometimes it works really well, and other times it requires a lot of modifications. I installed ollama as a local Windows app and downloaded the codeqwen model. But running it from the command line is a huge pain.

I then went looking for a local web GUI for ollama, but couldn't find anything that didn't require extra dependencies or a web server to be running.

I wanted something that would not need a proxy, would not require node.js, and would just run after being unzipped. I found a similar implementation on GitHub, removed the CORS workarounds and other external dependencies, added the ability to upload files, and instructed the LLM to use the uploaded files when answering my questions.
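The core idea can be sketched roughly like this: ollama serves a local HTTP API on port 11434 by default, so a plain HTML page can talk to it directly with `fetch()` and no server of its own. Below is a minimal sketch, assuming uploaded file text is prepended to the prompt; the function names `buildPrompt` and `askModel` are hypothetical, not the actual project's code.

```javascript
// Hypothetical sketch -- function names are illustrative, not from the real project.

// Wrap uploaded file contents so the model treats them as reference material.
function buildPrompt(files, question) {
  const context = files
    .map(f => `--- file: ${f.name} ---\n${f.text}`)
    .join("\n\n");
  return `Use the following uploaded files to answer the question.\n\n` +
         `${context}\n\nQuestion: ${question}`;
}

// Ollama's default local endpoint; a static page can call it directly.
async function askModel(files, question) {
  const res = await fetch("http://localhost:11434/api/generate", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({
      model: "codeqwen",
      prompt: buildPrompt(files, question),
      stream: false, // return one JSON object instead of a stream
    }),
  });
  const data = await res.json();
  return data.response;
}
```

In the browser, the uploaded files would come from an `<input type="file">` element, with each file's text read via `file.text()` before being passed to `buildPrompt`.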

I modified the GUI a little bit, and it works great. I'll upload it to GitHub as soon as possible.