swiftywizard@discuss.tchncs.de to Programmer Humor@programming.dev · 2 months ago
Lavalamp too hot (image, discuss.tchncs.de) · 71 comments
dream_weasel@sh.itjust.works · 2 months ago
This kind of stuff happens on any model you train from scratch, even before training for multi-step reasoning. It seems to happen more when there's not enough data in the training set, but it's not something that's added intentionally. Output length is a whole deal on its own.
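To illustrate the "output length" point, here's a minimal sketch (not anyone's actual setup): a toy bigram model "trained" on almost no data can loop back on itself, so the decode loop needs an explicit max-token cap and a stop token. The names (`counts`, `sample_next`, `generate`) and the tiny dataset are made up for illustration.

```python
import random

# Toy bigram "model" trained on very little data: next-token counts per token.
# With so little data the transitions can loop (hot -> the -> ...), so sampling
# can ramble unless the decode loop enforces a length cap and a stop token.
counts = {
    "<s>":  {"the": 3, "a": 1},
    "the":  {"lava": 2, "lamp": 2},
    "lava": {"lamp": 3},
    "lamp": {"is": 2, "</s>": 1},
    "is":   {"too": 2},
    "too":  {"hot": 2},
    "hot":  {"</s>": 1, "the": 1},  # loops back -> can ramble without a cap
}

def sample_next(token: str) -> str:
    """Sample the next token from the bigram counts (EOS fallback for unseen tokens)."""
    dist = counts.get(token)
    if not dist:
        return "</s>"
    tokens, weights = zip(*dist.items())
    return random.choices(tokens, weights=weights, k=1)[0]

def generate(max_new_tokens: int = 20) -> list[str]:
    """Sampling loop with an explicit length cap and EOS stop."""
    out, token = [], "<s>"
    for _ in range(max_new_tokens):  # hard cap so a rambling model can't run forever
        token = sample_next(token)
        if token == "</s>":          # stop token the model may or may not have learned
            break
        out.append(token)
    return out

if __name__ == "__main__":
    random.seed(0)
    print(" ".join(generate()))
```

Same idea at scale: if the model hasn't reliably learned when to emit EOS, the generation-time cap is what keeps outputs bounded.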