Ollama is now available on Windows in preview

  • I am running this on my desktop, using Open-WebUI for the front-end. I have a collection of a dozen or so fine-tunes of Mistral and a few other models. They are good enough for chatting and doing some information extraction tasks. The Open-WebUI app looks a lot like ChatGPT. You can even search your conversations.

    https://github.com/open-webui/open-webui

  • As usual, no AMD GPU support mentioned. What a sad state of affairs; I regret going with AMD this time.

  • If anyone is looking for a nice Chat UI on top of Ollama that supports both online models and local models, I’ve been working on an app [1] that is offline and privacy focused. I just released Windows support this morning.

    [1]: https://msty.app

  • What is the rationale for so many of these ‘run it locally’ AI ports to run as a server?

    Have developers forgotten that it’s actually possible to run code inside your UI process?

    We see the same thing with stable diffusion runners as well as LLM hosts.

    I don’t like running background services locally if I don’t need to. Why do these implementations all seem to operate that way?
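One practical upside of the server model the parent questions: every app on the machine can share a single loaded copy of the weights over HTTP instead of paying the multi-gigabyte load per process. A minimal sketch against Ollama's default local endpoint, using only the standard library (the model name is whatever you have pulled locally):

```python
import json
import urllib.request

# Ollama's default local endpoint; any process can talk to the one server.
OLLAMA_URL = "http://localhost:11434/api/generate"

def build_request(model: str, prompt: str) -> bytes:
    """Build the JSON payload the /api/generate endpoint expects."""
    return json.dumps({"model": model, "prompt": prompt, "stream": False}).encode()

def generate(model: str, prompt: str) -> str:
    """Send a prompt to the shared local server and return the completion.

    Because the server keeps the model resident, a second caller (your editor,
    a script, a chat UI) reuses the already-loaded weights.
    """
    req = urllib.request.Request(
        OLLAMA_URL,
        data=build_request(model, prompt),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]

# Usage (requires a running Ollama server with the model pulled):
#   print(generate("mistral", "Why run inference behind a local server?"))
```

Running it in-process is certainly possible (llama.cpp can be linked directly), but the server design is what lets several front-ends share one loaded model.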

  • Had no idea Windows users didn't have access to Ollama; it feels like only a few years ago we Mac users would have been the ones having to wait.

  • I'm curious what people think of the non-open-source LM Studio (https://lmstudio.ai) compared to Ollama.

  • Looks like it's already available on Linux & Mac. The change is that they're adding Windows: https://github.com/ollama/ollama

  • Just as I wanted to dabble with this and try installing all those ... requirements myself.

    And now this article.

    Tested it, and yes, it's amusing how simple it is, and it works.

    The only trouble I see is that, again, there is no option to select the installer's destination (so if you have a server with multiple users, they each end up with a personal copy instead of a global one).

  • I’m running Ollama with the hopes of putting together some grammar/typo checking workflow for my writing. Not directly related to Ollama, which is working great so far, but does anybody know of any place I can turn to for questions? Like, some sort of stackoverflow for LLMs?

  • I just ran this on my new Mac Mini (installing the llama2 model) and got a full-blown kernel panic. What?!

  • I've been playing around with it for the last couple of days on my Windows machine, using it for local tab-autocomplete in VS Code, and it's been just as good as it is on my Mac.

  • Ollama is fantastic. Truly makes running an LLM locally as easy as installing an app. Their node/python connectors are great, too.

    I've been able to reduce costs for my projects by offloading "easy" prompts to a local Mistral while reserving the more complex stuff for gpt4.

    By adding windows support, all those gaming GPUs will have a nice alternate use.
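The routing idea above can be sketched with a trivial heuristic. Everything here is a placeholder assumption (the length threshold, the model identifiers); a real router would use a better difficulty signal, such as task type or a cheap classifier:

```python
def classify_difficulty(prompt: str) -> str:
    """Crude illustrative heuristic: short, single-question prompts count as
    'easy'. The 200-character cutoff is arbitrary, not a recommendation."""
    if len(prompt) < 200 and prompt.count("?") <= 1:
        return "easy"
    return "hard"

def pick_model(prompt: str) -> str:
    """Route 'easy' prompts to a local Mistral via Ollama (free, private),
    and reserve the hosted GPT-4 API for the complex ones."""
    if classify_difficulty(prompt) == "easy":
        return "ollama/mistral"   # local, runs on your own GPU
    return "openai/gpt-4"         # paid API, for the hard stuff
```

The win is that the easy traffic, which is often the bulk of it, stops costing API money.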

  • Is this people’s preferred way to run and develop on top of local models?

    Any other must learn tools?

  • Can't seem to get it to see my NVIDIA Quadro P3000 GPU. Is it too old to make a difference?

  • I'm an Elixir fanboy so I'll use that as an example. I've purchased a lot of books on Elixir in PDF format, commercial products so they're not available publicly. Can I take something like Ollama and feed a model all my Elixir books and ask questions about the language in a conversation?
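What this describes is usually done with retrieval-augmented generation rather than retraining: extract the books' text, split it into chunks, embed each chunk (Ollama exposes an embeddings endpoint that can do this locally), then at question time retrieve the closest chunks and stuff them into the prompt. A rough, embedder-agnostic sketch of the retrieval half; the chunk size and k are arbitrary, and a real pipeline would split on paragraphs, not fixed offsets:

```python
import math

def cosine(a: list[float], b: list[float]) -> float:
    """Cosine similarity between two embedding vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / (na * nb)

def chunk(text: str, size: int = 500) -> list[str]:
    """Naive fixed-size chunking of a book's extracted text."""
    return [text[i:i + size] for i in range(0, len(text), size)]

def top_k(question_vec: list[float], chunk_vecs: list[list[float]],
          chunks: list[str], k: int = 3) -> list[str]:
    """Return the k chunks whose embeddings are closest to the question."""
    scored = sorted(zip(chunk_vecs, chunks),
                    key=lambda pair: cosine(question_vec, pair[0]),
                    reverse=True)
    return [text for _, text in scored[:k]]

def build_prompt(question: str, context_chunks: list[str]) -> str:
    """Stuff the retrieved passages into the prompt sent to the model."""
    context = "\n---\n".join(context_chunks)
    return f"Answer using only this context:\n{context}\n\nQuestion: {question}"
```

Since the books never leave your machine, this sidesteps the licensing problem of uploading commercial PDFs to a hosted service.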

  • This is the first time I've used llama2 (the 13B one), and it absolutely struggles with Chinese, even though it claims "I am capable of understanding Chinese and English" when asked what languages it understands.

    Is this expected?

    See the example below; it can't stay in Chinese at all.

        >>> 你知道海豚吗
    
        Ah, 海豚 (hǎitún) is a type of dolphin!  They are known for their intelligence and playful behavior in the ocean.
        Is there anything else you would like to know or discuss?
    
        >>> 请用中文回答
    
        Ah, I see!  As a 13b model, I can only communicate in Chinese. Here's my answer:
    
        海豚是一种智能和活泼的 marine mammal他们主要生活在海洋中。它们有着柔软的皮服、圆润的脸和小的耳朵。他们是 ocean 中的一 种美丽和 интерес的生物很多人喜欢去看他们的表演。
