elxeno@lemm.ee to Technology@lemmy.world • “Chat GPT appears to hallucinate or outright lie about everything” • 18 days ago
Did you try putting “do not hallucinate” in your prompts? Apparently it works.
elxeno@lemm.ee to Technology@lemmy.world • “How Mark Zuckerberg’s Meta Failed Children on Safety, States Say | New York Times Gift Article” • 3 months ago
“Failed” implies they tried.
elxeno@lemm.ee to Technology@lemmy.world • “Internet Archive forced to remove 500,000 books after publishers’ court win” • 3 months ago