This repository was archived by the owner on Sep 9, 2025. It is now read-only.
# InstructLab macOS App

## Scope

This document targets macOS applications, but the idea could easily be transferred to other operating systems.

## Problem statement

Starting InstructLab on a local laptop is hard. It requires a significant amount of `python` knowledge and terminal
work that is unrealistic for a non-technologist. Having to install `git`, specific versions of `python`, and
`xcode` requires a level of expertise that creates barriers to adoption for the InstructLab project.

## Proposed solution

[ollama][ollama] has a macOS application that is a double-click installation for their server to run the commands
locally. We propose creating the same kind of "system bar" application, with the ability to run `ilab model serve` in
the background and a possible way to do `ilab model chat` from said application.

Having the `ilab` dog in the system bar telling you that `ilab model serve` is running could open up the opportunity
to ask a quick question of the local model, and even an ability to open up a "long-running" conversation via a web
browser or the like.
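As a sketch of what the system-bar app could drive under the hood, the launcher would simply wrap the CLI with `subprocess`. The helper names and the `--model-path` flag below are assumptions for illustration, not a confirmed design:

```python
import subprocess


def serve_command(model_path=None):
    """Build the `ilab model serve` invocation the app would run.

    The `--model-path` flag for selecting a model file (e.g. Granite or
    Merlinite) is assumed here for illustration.
    """
    cmd = ["ilab", "model", "serve"]
    if model_path is not None:
        cmd += ["--model-path", model_path]
    return cmd


def start_server(model_path=None):
    """Launch the server in the background, as the system-bar app would,
    and return the process handle so the app can monitor or stop it."""
    return subprocess.Popen(serve_command(model_path))
```

The app would keep the returned `Popen` handle so the menu-bar icon can reflect whether the server is still running and terminate it on quit.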

## Next steps

1. Create a simple MVP that starts the `ilab model serve` application, with controls for the `serve` options, including
   which model you'd like to run, e.g. Granite or Merlinite.
2. Create an option to ask a quick question (a `-qq` option) of the local model via the drop-down.
3. Create an `ilab model chat` type interface via a window or web browser.
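The quick-question flow from step 2 could be sketched as a single round-trip to the locally served model. The endpoint path, default port, and model name below are assumptions (an OpenAI-compatible chat endpoint on `localhost:8000`), not confirmed details of the `ilab` server:

```python
import json
from urllib import request


def quick_question_payload(question, model="merlinite"):
    """Build a one-shot chat request body for the local server.

    Assumes an OpenAI-compatible /v1/chat/completions endpoint;
    the model name is illustrative.
    """
    return {
        "model": model,
        "messages": [{"role": "user", "content": question}],
    }


def ask(question, base_url="http://localhost:8000"):
    """Send the quick question and return the assistant's reply text."""
    data = json.dumps(quick_question_payload(question)).encode()
    req = request.Request(
        base_url + "/v1/chat/completions",
        data=data,
        headers={"Content-Type": "application/json"},
    )
    with request.urlopen(req) as resp:
        body = json.load(resp)
    return body["choices"][0]["message"]["content"]
```

The drop-down's quick-question box would call `ask()` and display the returned text; the "long-running" conversation would instead open a browser pointed at the chat interface from step 3.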

[ollama]: https://ollama.com/download/mac