Track_Shovel@slrpnk.net to Lemmy Shitpost@lemmy.world · English · 22 hours ago
Hexadecimal (slrpnk.net) · 59 comments
morrowind@lemmy.ml · 12 hours ago
Not really a concern. It's basically translation, which language models excel at. It just needs a mapping of the hex to bytes.
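The mapping this comment has in mind is the fixed hex-pair-to-byte correspondence; a minimal sketch (the `hex_to_bytes` helper is just an illustration of the lookup, Python already ships it as `bytes.fromhex`):

```python
# Every pair of hex digits maps to exactly one byte.
hex_string = "48656c6c6f"            # "Hello" encoded as hex
decoded = bytes.fromhex(hex_string)  # built-in version of the mapping

# The mapping itself is only 16 entries: one nibble (4 bits) per digit.
HEX_TO_NIBBLE = {d: i for i, d in enumerate("0123456789abcdef")}

def hex_to_bytes(s: str) -> bytes:
    """Illustrative re-implementation of bytes.fromhex for lowercase/uppercase hex."""
    s = s.lower()
    return bytes(
        (HEX_TO_NIBBLE[s[i]] << 4) | HEX_TO_NIBBLE[s[i + 1]]
        for i in range(0, len(s), 2)
    )

print(decoded)                      # b'Hello'
print(hex_to_bytes(hex_string))    # b'Hello'
```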
GissaMittJobb@lemmy.ml · 11 hours ago
It is a concern. Check out https://tiktokenizer.vercel.app/?model=deepseek-ai%2FDeepSeek-R1 and try entering some freeform hexadecimal data - you'll notice that it does not cleanly segment the hexadecimal numbers into individual tokens.
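A toy sketch of the effect being described, using a made-up vocabulary and greedy longest-match segmentation (this is NOT the actual DeepSeek tokenizer or its vocabulary): because real BPE-style vocabularies contain multi-character merges, the same hex data can split into uneven chunks rather than one token per digit.

```python
# Hypothetical vocabulary: all 16 hex digits plus a few arbitrary merges.
VOCAB = set("0123456789abcdef") | {"dead", "beef", "ca", "fe"}

def tokenize(text: str, vocab: set[str]) -> list[str]:
    """Greedy longest-match segmentation over a toy vocabulary."""
    tokens, i = [], 0
    while i < len(text):
        # Try the longest matching vocabulary entry at position i.
        for length in range(min(4, len(text) - i), 0, -1):
            piece = text[i:i + length]
            if piece in vocab:
                tokens.append(piece)
                i += length
                break
    return tokens

print(tokenize("deadbeefcafe", VOCAB))  # ['dead', 'beef', 'ca', 'fe']
print(tokenize("0a1b2c", VOCAB))        # ['0', 'a', '1', 'b', '2', 'c']
```

The point of the contrast: whether a model "sees" individual hex digits depends entirely on which merges happen to exist in the vocabulary, not on the data itself.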
morrowind@lemmy.ml · 10 hours ago
I'm well aware, but you don't necessarily need to see each character to translate to bytes.
GissaMittJobb@lemmy.ml · 10 hours ago
It's not out of the question that we get emergent behaviour where the model can connect non-optimally mapped tokens and still translate them correctly, yeah.