swiftywizard@discuss.tchncs.de to Programmer Humor@programming.dev · 19 days ago

**Lavalamp too hot** (image, discuss.tchncs.de) · 71 comments
dream_weasel@sh.itjust.works · 18 days ago

This kind of stuff happens on any model you train from scratch, even before training for multi-step reasoning. It seems to happen more when there's not enough data in the training set, but it's not an intentional add. Output length is a whole deal.
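(As a rough illustration of the comment's point about too little training data: a toy bigram word model, which is not what the commenter trained but is a minimal stand-in, collapses into a degenerate repetition loop under greedy decoding when the corpus is tiny. The corpus and function names below are made up for the sketch.)

```python
from collections import defaultdict, Counter

def train_bigram(corpus):
    # Count next-word frequencies for each word in the (tiny) corpus.
    counts = defaultdict(Counter)
    words = corpus.split()
    for a, b in zip(words, words[1:]):
        counts[a][b] += 1
    return counts

def generate(model, start, n=12):
    # Greedy decoding: always pick the most frequent successor.
    out = [start]
    for _ in range(n):
        successors = model.get(out[-1])
        if not successors:
            break
        out.append(successors.most_common(1)[0][0])
    return " ".join(out)

corpus = "the lamp is hot the lamp is bright the lamp is hot"
model = train_bigram(corpus)
print(generate(model, "the"))
# With so little data, the model falls into a loop:
# "the lamp is hot the lamp is hot ..."
```

The same failure mode shows up, in fancier form, in undertrained neural models: with too few examples, the highest-probability continuation keeps cycling, which is also why generations can run on far longer than intended.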