Performance compared to native validation #875
-
Native Laravel validation is well known to be slow, so I was exploring this package hoping it would be more performant, but I get these results:

```php
class IdData extends \Spatie\LaravelData\Data
{
    public function __construct(
        public int $id,
    ) {
    }
}

class ArrayData extends \Spatie\LaravelData\Data
{
    public function __construct(
        /** @var IdData[] */
        public array $array,
    ) {
    }
}

$array = array_map(fn (int $id) => ['id' => $id], range(1, 5_000));

\Illuminate\Support\Benchmark::dd([
    fn () => \Illuminate\Support\Facades\Validator::validate(
        ['array' => $array],
        ['array.*.id' => 'required|integer'],
    ),
    fn () => ArrayData::validate(['array' => $array]),
]);
```

```
array:2 [
  0 => "188.543ms"
  1 => "13,456.497ms"
]
```

Am I doing something wrong? (And, needless to say, thank you so much for your contribution to the ecosystem ❤️)
-
Placing these classes in
-
I think as a rule of thumb: never use data validation for really big collections 😅

The thing is that we use NestedRules for each data object, and those are tailored to every data object, since one can always change depending on the values provided (see the rules method). Technically we could add some smart caching here and there, which could speed up the process (when there's no rules method we kinda know the rules will probably be the same for every object). But adding such a thing is a lot of work and requires really good testing in order not to break anything. It's on my long list for data v5, but that will probably take some months to even get started on.
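In the meantime, one workaround (a minimal sketch, assuming the classes from the benchmark above) is to validate the payload once with plain Laravel rules and only then hydrate the data objects; `from()` skips the validation pipeline that `validate()` runs, so the cost should stay close to the native benchmark:

```php
use Illuminate\Support\Facades\Validator;

// Validate the whole payload once with plain Laravel rules
// (the fast path from the benchmark above)...
$validated = Validator::validate(
    ['array' => $array],
    ['array.*.id' => 'required|integer'],
);

// ...then hydrate the DTOs without validating again:
// ArrayData::from() maps the payload onto the objects but does not
// build per-object NestedRules the way ArrayData::validate() does.
$data = ArrayData::from($validated);
```

The trade-off is that any rules defined in a `rules()` method on the data objects are not applied, so this only helps when the hand-written rule set really covers everything.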
In Flare we've created a validator per item, which was surprisingly faster than expected. Otherwise: manually writing out the rules in PHP, or searching for another library?
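For reference, a rough sketch of that per-item approach (using the rule set from the benchmark above purely as an illustration, not Flare's actual rules):

```php
use Illuminate\Support\Facades\Validator;

// One small validator per item: each call only compiles the rules
// for a single ['id' => ...] entry instead of one giant nested
// rule set for the whole 5 000-item payload.
$items = array_map(
    fn (array $item) => IdData::from(
        Validator::validate($item, ['id' => 'required|integer'])
    ),
    $array,
);

$data = new ArrayData($items);
```

How this compares to a single native validator over the whole payload will depend on the rule set and the payload size, so it's worth benchmarking against your own data.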