Google's AI-focused phone: its 8GB of RAM runs Nanobot well in Termux, and the Tensor G3's on-device ML support is a bonus.
Verified benchmark results from the OpenClaw team.
Runs under Termux with proot-distro. 8GB of RAM handles it, but battery drain is significant.
Automated performance benchmarks from Docker-constrained environments.
Real experiences from users who tested these forks.
Nanobot in Termux runs well. 8GB and Tensor G3 make it snappy.
5s cold start