I'm using CQRS in all my projects. It enables me to have processes that are very easy to understand and are independent from each other.
As every endpoint has separate read models, it's also possible to improve a single endpoint without affecting any other. One possible improvement is to use a native query and construct the read model from the query result, instead of pulling Doctrine entities and converting them into read models "manually".
When doing so, there are two stumbling blocks:
Mapping of query result data to read models.
Decoding of query result data (as it's all strings).
Mapping of query result data to read models.
There are multiple mapping solutions for PHP, but most of them require some kind of custom mapping for each read model. Instead, I'm using the Symfony serializer. It's already in use for the conversion of API requests and responses. Additionally, I have custom normalizers that handle a variety of value objects. Why not use the same for the conversion from a query result to a read model?
I'm just wrapping the Symfony denormalizer in a class that handles the type for an array and using template annotations to add more typing. But even that's optional.
&lt;?php

declare(strict_types=1);

namespace App\Serializer;

use Symfony\Component\Serializer\Normalizer\DenormalizerInterface;

final readonly class Denormalizer
{
    public function __construct(
        private DenormalizerInterface $denormalizer,
    ) {
    }

    /**
     * @template T of object
     *
     * @param class-string&lt;T&gt; $class
     *
     * @return array&lt;int, T&gt;
     */
    public function denormalizeArray(
        array $data,
        string $class,
    ): array {
        /** @var array&lt;int, T&gt; */
        return $this->denormalizer->denormalize(
            data: $data,
            type: self::arrayOfClass($class),
        );
    }

    public static function arrayOfClass(string $class): string
    {
        return sprintf('%s[]', $class);
    }
}
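Usage is then a single call; `$rows` and `CustomerReadModel` below are placeholders for a query result and any read model class:

```php
<?php

// $rows: the result of a native query as a list of associative arrays;
// CustomerReadModel: a placeholder for any read model class.
$readModels = $denormalizer->denormalizeArray(
    data: $rows,
    class: CustomerReadModel::class,
);
```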
Decoding of query result data
When using Connection::fetchAllAssociative(), the query result will be an associative array with all columns, but all values (except booleans and integers) will be strings due to the underlying pg_fetch_assoc.
We can solve that by wrapping the connection in a class that performs the decoding based on a specific configuration for each query. This way we can convert a JSON string into an associative array that can then be denormalized directly into a read model.
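A minimal sketch of such a wrapper, assuming the per-query configuration is a simple column-to-decoder map (the class name DecodingConnection and the decoders parameter are made up for illustration):

```php
<?php

declare(strict_types=1);

namespace App\Database;

use Doctrine\DBAL\Connection;

final readonly class DecodingConnection
{
    public function __construct(
        private Connection $connection,
    ) {
    }

    /**
     * Runs the query and decodes the configured columns of every row.
     *
     * @param array<string, callable(string): mixed> $decoders column name => decoder
     *
     * @return list<array<string, mixed>>
     */
    public function fetchAllAssociative(
        string $sql,
        array $params = [],
        array $decoders = [],
    ): array {
        return array_map(
            static fn (array $row): array => self::decodeRow($row, $decoders),
            $this->connection->fetchAllAssociative($sql, $params),
        );
    }

    /**
     * Applies each configured decoder to its column; non-string values
     * (booleans, integers) are left untouched.
     *
     * @param array<string, mixed> $row
     * @param array<string, callable(string): mixed> $decoders
     *
     * @return array<string, mixed>
     */
    public static function decodeRow(array $row, array $decoders): array
    {
        foreach ($decoders as $column => $decode) {
            if (isset($row[$column]) && is_string($row[$column])) {
                $row[$column] = $decode($row[$column]);
            }
        }

        return $row;
    }
}
```

The per-row decoding is a pure function, so it's split into a separate static method; a JSON column would get a decoder like `static fn (string $json): array => json_decode($json, true)`.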
Putting it together
Using those pieces, we could get the following result:
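As one possible shape, a data provider built from a decoding connection wrapper and the Denormalizer could look like this (the read model, the table layout, and the wrapper's decoders parameter are all assumptions for illustration):

```php
<?php

declare(strict_types=1);

namespace App\ReadModel;

use App\Serializer\Denormalizer;

// Sketch only: CustomerReadModel, the table names and the decoding
// connection wrapper (here: DecodingConnection) are placeholders.
final readonly class CustomerDataProvider
{
    public function __construct(
        private DecodingConnection $connection,
        private Denormalizer $denormalizer,
    ) {
    }

    /** @return array<int, CustomerReadModel> */
    public function getCustomers(): array
    {
        // One native query instead of pulling entities: the count and the
        // address aggregation happen in SQL, and the JSON column is decoded
        // by the connection wrapper before denormalization.
        $rows = $this->connection->fetchAllAssociative(
            <<<'SQL'
                SELECT c.id,
                       c.name,
                       (SELECT COUNT(*) FROM "order" o WHERE o.customer_id = c.id) AS order_count,
                       (SELECT jsonb_agg(a) FROM address a WHERE a.customer_id = c.id) AS addresses
                FROM customer c
                SQL,
            decoders: ['addresses' => static fn (string $json): array => json_decode($json, true)],
        );

        return $this->denormalizer->denormalizeArray($rows, CustomerReadModel::class);
    }
}
```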
The read models would be the same without this concept. But instead of the data provider that constructs the read models, the query handler would trigger multiple queries through Doctrine to pull all entities, run counts, and loop through addresses. SQL is just better suited to such tasks and, as a side effect, is far more performant.