I tried to run ollama with deepseek-r1:1.5b using the guide on https://robododd.com/how-to-run-the-dee ... i-locally/.
Unfortunately I got an error (see below). When I tried again, the download started from scratch; I had expected it to reuse the already-downloaded files. The Raspberry Pi runs in headless mode (i.e. X or Wayland is not running) and I'm logged in via ssh. I had shut down BOINC and the other clients/servers that normally run on this computer.
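Before re-pulling, it can be worth checking what Ollama already has on disk. A minimal sketch, assuming the default systemd install that install.sh creates (if OLLAMA_MODELS is set to something else, adjust the path accordingly):

```shell
# Models Ollama considers fully downloaded
ollama list

# Inspect the blob store directly (default location for the systemd
# service's "ollama" user; partial pulls show up as *-partial files)
sudo ls -lh /usr/share/ollama/.ollama/models/blobs
```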
Code:
$ neofetch --off
pi@BoincPi
------------
OS: Debian GNU/Linux 12 (bookworm) aarch64
Host: Raspberry Pi 4 Model B Rev 1.1
Kernel: 6.6.62+rpt-rpi-v8
Uptime: 8 days, 21 hours, 54 mins
Packages: 1646 (dpkg)
Shell: bash 5.2.15
Resolution: 1280x1024
Terminal: /dev/pts/1
CPU: (4) @ 1.500GHz
Memory: 168MiB / 3797MiB
Unfortunately I got an error:
Code:
Error: Post "http://127.0.0.1:11434/api/show": read tcp 127.0.0.1:57660->127.0.0.1:11434: read: connection reset by peer
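A "connection reset by peer" on 127.0.0.1 usually means the server side dropped the connection mid-request — on a 4 GB Pi, the ollama service crashing or being OOM-killed is a plausible cause. A quick way to check, assuming the systemd service the installer set up:

```shell
# Is the ollama service running, and did it restart recently?
systemctl status ollama

# Last 50 lines of the server log (crashes and restarts show up here)
journalctl -u ollama -n 50 --no-pager

# Did the kernel OOM killer fire?
dmesg | grep -i -E 'oom|out of memory'
```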
Update: It worked the second time.
Code:
pi@BoincPi:~ $ curl -fsSL https://ollama.com/install.sh | sh
>>> Installing ollama to /usr/local
>>> Downloading Linux arm64 bundle
######################################################################## 100.0%
>>> Creating ollama user...
>>> Adding ollama user to render group...
>>> Adding ollama user to video group...
>>> Adding current user to ollama group...
>>> Creating ollama systemd service...
>>> Enabling and starting ollama service...
Created symlink /etc/systemd/system/default.target.wants/ollama.service → /etc/systemd/system/ollama.service.
>>> The Ollama API is now available at 127.0.0.1:11434.
>>> Install complete. Run "ollama" from the command line.
WARNING: No NVIDIA/AMD GPU detected. Ollama will run in CPU-only mode.
pi@BoincPi:~ $ ollama --version
ollama version is 0.5.7
pi@BoincPi:~ $ ollama run deepseek-r1:1.5b
pulling manifest
pulling aabd4debf0c8... 100% ▕████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████▏ 1.1 GB
Error: Post "http://127.0.0.1:11434/api/show": read tcp 127.0.0.1:57660->127.0.0.1:11434: read: connection reset by peer
pi@BoincPi:~ $ ollama run deepseek-r1:1.5b
pulling manifest
pulling aabd4debf0c8... 100% ▕████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████▏ 1.1 GB
pulling 369ca498f347... 100% ▕████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████▏  387 B
pulling 6e4c38e1172f... 100% ▕████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████▏ 1.1 KB
pulling f4d24e9138dd... 100% ▕████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████▏  148 B
pulling a85fe2a2e58e... 100% ▕████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████▏  487 B
verifying sha256 digest
writing manifest
success
>>> What a time to be alive. An AI chatbot running locally! What do you think?
<think>
Okay, so I'm trying to figure out what the user is asking here. They mentioned "What a time to be alive." That's pretty catchy and makes me imagine being in a good mood or feeling happy. Then they started talking about an AI chatbot running locally. Hmm, that's interesting because I know AI chatbots can do a lot of things these days, but the idea of them running locally sounds different from what I usually see.

I think "locally" might mean that instead of using the internet or cloud services to run, this AI chatbot would be built within its own environment. That could make it more accessible because you don't need a network or a big server set up on your computer. It's like having an AI bot that works well in their living space without any extra hardware.

I'm wondering if the AI is designed to handle specific tasks. Maybe it's useful for personal help, like answering questions, giving advice, or even controlling things automatically. Or perhaps it's part of a system where multiple chatbots work together, making it more powerful and versatile than a single one could be.

But I'm not sure about the capabilities beyond what an AI chatbot does on its own. I should consider the potential benefits: maybe increased productivity for small tasks or easier access to information without needing complex setups. Also, if it's built locally, it might be cheaper to run and more convenient.

I wonder how people use this AI chatbot now. If they're talking about a personal AI assistant, would it have features like voice assistants or live chat? Or is it just a static chatbot that doesn't interact with users? I think it's the latter because "running locally" might imply something that can engage with users in real-time.

Another thought: the AI could be integrated with other technologies. For example, if you want to automate certain tasks using this AI chatbot, maybe it can handle not just text but also commands or even make decisions on your behalf. That would make it more versatile and useful for a range of applications beyond just chatting.

I'm also thinking about the challenges involved. Building an AI bot that runs locally might require setting up its own database, servers, or cloud services. It could be time-consuming if you're not tech-savvy but probably manageable with some experience. Plus, managing multiple chatbots might mean dealing with maintenance and updates for each one.

In terms of applications, this AI chatbot could be used in various ways: as a mobile assistant during the day, as part of an emergency response system, or even as a tool for remote learning where students can interact directly with a virtual TA. It would be handy to have quick answers or help without needing to search through logs or deal with complicated interfaces.

But wait, there are also potential downsides. If it's too specialized or doesn't handle common tasks well, it might not feel as helpful as someone who can chat in real-time. Also, if the AI isn't trained properly, it could make decisions that aren't accurate or fair. That's something to consider when integrating such a system into daily use.

Overall, I think the user is talking about an AI bot that runs on its own hardware, which offers flexibility and ease of use compared to cloud-based solutions. It can be integrated into various applications and settings, making it a useful tool for enhancing interactions without needing complex setups or internet access.
</think>
The AI chatbot you're referring to is designed to operate locally, meaning it functions within the user's own environment, potentially using local hardware and services. This setup offers flexibility and convenience, allowing users to engage with it in real-time without needing network access. Such a chatbot can handle tasks like answering questions, providing advice, or controlling automatic systems, making it versatile for various applications. While specialized handling might limit its effectiveness compared to real-time interactions, the bot's integration into daily use makes it adaptable and user-friendly across different settings and environments.

>>> Send a message (/? for help)
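Once the model is pulled, the same local server the CLI talks to can also be queried directly over HTTP. A minimal sketch against the default 127.0.0.1:11434 endpoint printed by the installer (the prompt text here is just an example):

```shell
# Non-streaming request to the local Ollama generate API
payload='{"model": "deepseek-r1:1.5b", "prompt": "Say hello in one sentence.", "stream": false}'
curl -s http://127.0.0.1:11434/api/generate -d "$payload"
```

With "stream": false the server returns a single JSON object whose "response" field holds the full answer, which is easier to script against than the default streamed chunks.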
Statistics: Posted by mob-i-l — Wed Jan 29, 2025 3:06 am