Case 4. Attention Instead of Averaging in RubixML

This case swaps simple mean pooling of a sample's feature vectors for a self-attention aggregation step, implemented as a custom RubixML transformer.
```php
<?php

use Rubix\ML\Transformers\Transformer;

// Replaces each sample with the first row of its self-attention output
// (CLS-style pooling) rather than a plain average. Assumes each sample
// is a sequence of feature vectors and that an attention() helper
// (queries, keys, values) is available.
class AttentionAggregator implements Transformer
{
    public function transform(array &$samples) : void
    {
        foreach ($samples as &$sample) {
            $sample = attention($sample, $sample, $sample)[0];
        }
    }
}
```

The aggregator then slots into an ordinary RubixML pipeline ahead of standardization and the final estimator:

```php
use Rubix\ML\Pipeline;
use Rubix\ML\Classifiers\LogisticRegression;

$pipeline = new Pipeline([
    new AttentionAggregator(),
    new Standardizer(),
], new LogisticRegression());
```