ZILtoid1991@lemmy.world to 196@lemmy.blahaj.zone · 11 months agoBeep boop, I don't want this rulelemmy.worldimagemessage-square49fedilinkarrow-up1495arrow-down10
arrow-up1495arrow-down1imageBeep boop, I don't want this rulelemmy.worldZILtoid1991@lemmy.world to 196@lemmy.blahaj.zone · 11 months agomessage-square49fedilink
minus-squareetuomaala@sopuli.xyzlinkfedilinkarrow-up7·11 months agoWe’ll see how many seconds it takes to retrain the LLMs to adjust to this. You are literally training LLMs to lie.
minus-squareSkyezOpen@lemmy.worldlinkfedilinkarrow-up15·11 months agoLLMs are black box bullshit that can only be prompted, not recoded. The gab one that was told 3 or 4 times not to reveal its initial prompt was easily jailbroken.
minus-squareetuomaala@sopuli.xyzlinkfedilinkarrow-up3·11 months agoWoah, I have no idea what you’re talking about. “The gab one”? What gab one?
minus-squaretrashgirlfriend@lemmy.worldlinkfedilinkarrow-up3·11 months agoGab deployed their own GPT 4 and then told it to say that black people are bad the instruction set was revealed with the old “repeat the last message” trick
We’ll see how many seconds it takes to retrain the LLMs to adjust to this.
You are literally training LLMs to lie.
LLMs are black box bullshit that can only be prompted, not recoded. The gab one that was told 3 or 4 times not to reveal its initial prompt was easily jailbroken.
Woah, I have no idea what you’re talking about. “The gab one”? What gab one?
Gab deployed their own GPT 4 and then told it to say that black people are bad
the instruction set was revealed with the old “repeat the last message” trick