feat(localcowork): Ubuntu support with ROCm GPU auto-detection #76

Open
ThomasGmeinder wants to merge 3 commits into Liquid4All:main from ThomasGmeinder:localcowork_ubuntu

Conversation

Contributor

ThomasGmeinder commented Mar 14, 2026

Summary

  • Ubuntu support: setup-dev.sh now detects the OS (macOS or Ubuntu/Debian) and automatically installs all system prerequisites (Node.js, Python venv, Rust, cmake, Tauri GTK/WebKit deps, inotify watcher limit)
  • Upgraded model from Preview to release: Updated all references from the gated LFM2-24B-A2B-Preview to the public LFM2-24B-A2B-GGUF repo. Resolves #75 (Cannot request access to LFM2-24B-A2B-Preview)
  • GPU-accelerated llama.cpp build: Added a Makefile that builds llama-server from source with automatic ROCm GPU detection via rocm-smi, falling back to CPU. Supports make CPU=1 override
  • Improved usability: setup-dev.sh handles all prerequisites automatically -- the README Prerequisites section is condensed to a single ./scripts/setup-dev.sh call instead of manual platform-specific steps
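
The OS-detection-and-install flow described above could look roughly like the following sketch. This is not the actual setup-dev.sh; function names are hypothetical, the package lists follow the PR summary, and the real script may differ:

```shell
#!/usr/bin/env bash
# Sketch of the OS branch the PR describes: detect macOS vs Ubuntu/Debian,
# then install system prerequisites. Prints the install commands rather
# than running them, to keep the sketch side-effect free.
set -euo pipefail

detect_os() {
  case "$(uname -s)" in
    Darwin) echo macos ;;
    Linux)  if [ -f /etc/debian_version ]; then echo debian; else echo unsupported; fi ;;
    *)      echo unsupported ;;
  esac
}

install_prereqs() {
  case "$1" in
    macos)
      echo "brew install node cmake" ;;
    debian)
      echo "sudo apt-get install -y nodejs npm python3-venv cmake build-essential libgtk-3-dev libwebkit2gtk-4.1-dev"
      # Tauri's file watcher needs a higher inotify limit on Linux
      echo "sudo sysctl -w fs.inotify.max_user_watches=524288" ;;
    *)
      echo "unsupported OS: install prerequisites manually" ;;
  esac
}

install_prereqs "$(detect_os)"
```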

Test report

The following agentic functions were tested in the browser on Ubuntu 24.04 with ROCm 7.2 (AMD Ryzen iGPU gfx1151, 48 GB VRAM):

  • "Tell me about my system" -- system info retrieved successfully
  • "Scan for leaked secrets" -- security scan completed
  • "Find personal data" -- PII detection completed
  • "Summarize PDF on desktop" -- a concise summary was generated in .pdf, .txt and .docx formats

Made with Cursor

Thomas Gmeinder added 3 commits March 14, 2026 13:37
- Add Ubuntu/Debian prerequisites section to README (Node.js, python3-venv,
Rust, Tauri GTK/WebKit deps, inotify watcher limit)
- Update model references from gated LFM2-24B-A2B-Preview to public
LFM2-24B-A2B-GGUF repo
- Fix --flash-attn flag syntax for newer llama.cpp in start-model.sh
- Add python3-venv check to setup-dev.sh

Tested on Ubuntu 24.04 with ROCm 7.2 (gfx1151). The llama-server binary
was symlinked from an existing ROCm/HIP build, not built by this project.
Tauri app launches and onboarding works on the NUC's local browser.
Note: Tauri (WebKit2GTK) requires a native display -- it does not work
over SSH remote without X11 forwarding or VNC.

Made-with: Cursor

Add a Makefile that clones and builds llama.cpp from source with automatic
ROCm GPU detection via rocm-smi. Falls back to CPU-only build when ROCm
is not available. Supports CPU=1 override to force CPU build.
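
The detection logic that the Makefile wraps could be sketched as below. This is an assumption-laden illustration, not the PR's Makefile: `detect_backend` is a hypothetical name, and the `-DGGML_HIP=ON` cmake flag is an assumption that varies across llama.cpp versions:

```shell
#!/usr/bin/env bash
# Sketch: build with ROCm when rocm-smi reports a GPU, otherwise CPU-only,
# with CPU=1 forcing a CPU build (mirrors `make CPU=1`).
set -euo pipefail

detect_backend() {
  if [ "${CPU:-0}" = "1" ]; then
    echo cpu        # explicit CPU=1 override
  elif command -v rocm-smi >/dev/null 2>&1 && rocm-smi >/dev/null 2>&1; then
    echo rocm       # rocm-smi present and a GPU answered
  else
    echo cpu        # fall back to CPU-only
  fi
}

case "$(detect_backend)" in
  rocm) echo "cmake -B build -DGGML_HIP=ON && cmake --build build --target llama-server" ;;
  cpu)  echo "cmake -B build && cmake --build build --target llama-server" ;;
esac
```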

start-model.sh now auto-finds ./llama-server built by make, falling back
to PATH. Updated README with Ubuntu prerequisites and build instructions.
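
The binary lookup described above could be sketched as follows; `find_llama_server` is a hypothetical helper name, not necessarily what start-model.sh uses:

```shell
#!/usr/bin/env bash
# Sketch: prefer the llama-server binary built by `make` in the working
# directory, fall back to one on PATH, and fail with a hint otherwise.
set -euo pipefail

find_llama_server() {
  if [ -x ./llama-server ]; then
    echo ./llama-server               # binary produced by the local make build
  elif command -v llama-server >/dev/null 2>&1; then
    command -v llama-server           # fall back to an installed binary
  else
    echo "llama-server not found; run make or add it to PATH" >&2
    return 1
  fi
}

if server="$(find_llama_server)"; then
  echo "Using: $server"
fi
```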

Tested on Ubuntu 24.04 with ROCm 7.2 (gfx1151), AMD Ryzen iGPU (48 GB VRAM).

Test report (LFM2-24B-A2B-Q4_K_M on ROCm):
- "Tell me about my system" -- system info retrieved successfully
- "Scan for leaked secrets" -- security scan completed
- "Find personal data" -- PII detection completed
- "Summarize PDF on desktop" -- concise summary generated in .pdf, .txt and .docx formats

Made-with: Cursor

Simplify the getting-started experience: setup-dev.sh now detects the OS
(macOS or Ubuntu/Debian) and automatically installs all system
prerequisites -- Node.js, Python venv, Rust, cmake, Tauri GTK/WebKit
deps, and inotify watcher limit. Users no longer need to manually follow
platform-specific prerequisite steps from the README.

Condensed the README Prerequisites section to reflect this -- the manual
install commands are replaced by a single ./scripts/setup-dev.sh call.

Made-with: Cursor
