• 0x01@lemmy.ml

    Llama is good and I’m looking forward to trying DeepSeek 3, but the big issue is that those are the frontier open source models, while 4o is no longer OpenAI’s best performing model. They just dropped o3 (god, they are literally as bad as Microsoft at naming), which shows tremendous progress in reasoning on benchmarks.

    When running Llama locally I appreciate the matched capabilities like structured output, but it is objectively significantly worse than OpenAI’s models. I would like to support open source models and use them exclusively, but dang, it’s hard to give up the results.
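
    For what it’s worth, the structured output part is easy to reproduce locally. Here’s a minimal sketch assuming an Ollama server on the default port and its OpenAI-compatible endpoint; the model tag (llama3.1), the prompt, and the example schema are just placeholders for illustration, not anything official:

    ```python
    # Minimal sketch: JSON-constrained output from a local Llama via Ollama's
    # OpenAI-compatible endpoint. Assumes `ollama serve` is running on the
    # default port and the "llama3.1" model has been pulled -- both are
    # assumptions of this example.
    from openai import OpenAI

    client = OpenAI(base_url="http://localhost:11434/v1", api_key="ollama")

    resp = client.chat.completions.create(
        model="llama3.1",
        messages=[
            {"role": "system",
             "content": 'Reply only with JSON: {"sentiment": str, "confidence": float}'},
            {"role": "user",
             "content": "The new release fixed every bug I reported."},
        ],
        # JSON mode; stricter schema-level constraints depend on server version
        response_format={"type": "json_object"},
    )

    print(resp.choices[0].message.content)
    ```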

    I suppose one way to start for me would be dropping Cursor and Copilot in favor of their open source equivalents, but switching my business over to Llama is a hard pill to swallow.