In Django, it’s often useful to trigger certain actions when a model is saved. While Django provides built-in signals like post_save, there are cases where you want to define custom behavior for every model that inherits from a common abstract base model. This article shows how to implement that pattern, add logging, and test it effectively.

The Problem

Imagine you have an abstract base model that defines common fields and behaviors for multiple models in your Django application. You want to trigger a specific action whenever any of the child models are saved, without having to implement this logic in each child model separately. Additionally, you want to log these actions for debugging and monitoring purposes.

The Solution

We’ll create an abstract BaseModel that includes a custom save method and a post_save signal handler. This will allow us to define behavior that applies to all models inheriting from BaseModel, including logging functionality.

Step 1: Define the BaseModel

First, let’s define our BaseModel in models.py:

# project/library/models.py
import logging

from django.db import models

logger = logging.getLogger(__name__)


class BaseModel(models.Model):
    created_at = models.DateTimeField(auto_now_add=True)
    updated_at = models.DateTimeField(auto_now=True)

    class Meta:
        abstract = True

    def save(self, *args, **kwargs):
        # Custom save logic here
        super().save(*args, **kwargs)

    def on_post_save(self):
        # Custom post_save logic here
        # This method will be called after the model is saved
        logger.info(f"{self.__class__.__name__} instance saved: {self.pk}")

Step 2: Create a Child Model

Now, let’s create a child model that inherits from BaseModel:

# project/library/models.py


class Book(BaseModel):
    title = models.CharField(max_length=200)
    author = models.ForeignKey("Author", on_delete=models.CASCADE)

    class Meta:
        verbose_name = "Book"
        verbose_name_plural = "Books"

    def __str__(self):
        return f"{self.title} - {self.author}"

Step 3: Implement the Signal

Create a new file called signals.py in your app directory:

# project/library/signals.py
from django.db.models.signals import post_save
from django.dispatch import receiver

from project.library.models import BaseModel


@receiver(post_save)
def basemodel_on_post_save(sender, instance, **kwargs):
    # The receiver is connected to post_save for every model, so filter
    # down to senders that inherit from BaseModel before delegating.
    if issubclass(sender, BaseModel):
        instance.on_post_save()

Step 4: Connect the Signal

In your app’s apps.py file, import the signals module inside ready() so the receiver is registered when Django starts:

# project/library/apps.py
from django.apps import AppConfig

class LibraryConfig(AppConfig):
    name = "project.library"

    def ready(self):
        # Importing the module runs the @receiver decorators and registers the handler
        import project.library.signals  # noqa: F401

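For ready() to run, the app configuration has to be loaded by Django, which means the app should appear in INSTALLED_APPS. A minimal excerpt (the surrounding entries are placeholders) might look like:

# project/settings.py (excerpt)
INSTALLED_APPS = [
    # ... Django's built-in apps ...
    "project.library",
]
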
Testing the Implementation

Now, let’s write some tests to ensure our implementation works correctly, including the new logging functionality. We’ll use pytest, pytest-django, and model-bakery for this.

Create a new file called test_models.py in your tests directory:

# tests/integration/library/test_models.py
import logging
from unittest import mock

import pytest
from model_bakery import baker

from project.library.models import Book, Author

pytestmark = pytest.mark.django_db


class TestBaseModel:
    def test_on_post_save_called(self):
        with mock.patch.object(Book, "on_post_save") as mock_on_post_save:
            author = baker.make(Author)
            baker.make(Book, author=author)

        mock_on_post_save.assert_called_once()

    def test_created_at_and_updated_at(self):
        author = baker.make(Author)
        book = baker.make(Book, author=author)

        assert book.created_at is not None
        assert book.updated_at is not None

    def test_update_changes_updated_at(self):
        # Create a book instance
        author = baker.make(Author)
        book = baker.make(Book, author=author)

        # Store the original updated_at value
        original_updated_at = book.updated_at

        # Update the book
        book.title = "New Title"
        book.save()

        assert book.updated_at > original_updated_at

    def test_logging_on_post_save(self, caplog):
        caplog.set_level(logging.INFO)

        author = baker.make(Author)
        book = baker.make(Book, author=author)

        assert f"Book instance saved: {book.pk}" in caplog.text

    def test_logging_on_update(self, caplog):
        caplog.set_level(logging.INFO)

        # Create a book instance
        author = baker.make(Author)
        book = baker.make(Book, author=author)

        caplog.clear()

        # Update the book
        book.title = "Updated Title"
        book.save()

        assert f"Book instance saved: {book.pk}" in caplog.text

To run these tests, you’ll need to install the required packages:

$ pip install pytest pytest-django model-bakery

Then, you can run the tests using:

$ pytest
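
pytest-django also needs to know which settings module to use. One way is a pytest.ini at the project root; the settings path below is an assumption, so adjust it to your project:

# pytest.ini
[pytest]
DJANGO_SETTINGS_MODULE = project.settings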

Conclusion

By implementing a custom save method and post_save signal handler in an abstract base model, we’ve created a reusable pattern for triggering actions and logging when any child model is saved. This approach keeps our code DRY and makes it easy to add common functionality to multiple models.

The addition of logging provides valuable insights into model operations, which can be crucial for debugging and monitoring your application’s behavior.

Remember to always test your implementations thoroughly, as we’ve done here using pytest, model-bakery, and pytest’s caplog fixture. This ensures that your signals, model methods, and logging behave as expected across different scenarios.

When implementing logging in your Django project, don’t forget to configure your logging settings in your project’s settings file to ensure that logs are captured and stored appropriately for your production environment.
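
As a starting point, a minimal sketch of such a configuration, routing INFO-level messages from the library app to the console (handler names and levels are just one possible choice), could look like:

# project/settings.py (excerpt)
LOGGING = {
    "version": 1,
    "disable_existing_loggers": False,
    "handlers": {
        "console": {"class": "logging.StreamHandler"},
    },
    "loggers": {
        "project.library": {
            "handlers": ["console"],
            "level": "INFO",
        },
    },
}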