
[Issue #2603] Setup triggers on our opportunity tables which populate the search queue table #2611

Merged: 7 commits into main, Oct 29, 2024

Conversation


@mikehgrantsgov mikehgrantsgov commented Oct 28, 2024

Summary

Fixes #2603

Time to review: 15 mins

Changes proposed

Add a migration which adds DB triggers to populate opportunity_search_index_queue based on inserts and updates made to the following existing tables (a sketch of one such trigger appears after this list):
opportunity
opportunity_assistance_listing
current_opportunity_summary
opportunity_summary
link_opportunity_summary_funding_instrument
link_opportunity_summary_funding_category
link_opportunity_summary_applicant_type
opportunity_attachment

Context for reviewers

See the test SQL below. The new triggers add or update entries in the opportunity_search_index_queue table; those entries remain flagged until a subsequent process handles them.

Additional information

See attached SQL file for running tests based on these changes:

-- Start transaction for all tests
BEGIN;

-- Test 1: Basic opportunity insert
INSERT INTO api.opportunity (opportunity_id, opportunity_title, is_draft)
VALUES (99999, 'Test Opportunity', false);

-- Verify queue entry was created
SELECT EXISTS (
    SELECT 1 FROM api.opportunity_search_index_queue 
    WHERE opportunity_id = 99999 AND has_update = true
) as "Test 1: Queue entry created for new opportunity";

-- Test 2: Multiple related inserts in single transaction
INSERT INTO api.opportunity (opportunity_id, opportunity_title, is_draft)
VALUES (99998, 'Test Multi-Update Opportunity', false);

INSERT INTO api.opportunity_summary (
    opportunity_summary_id, 
    opportunity_id, 
    summary_description,
    is_forecast
) VALUES (99998, 99998, 'Test Summary', false);

INSERT INTO api.current_opportunity_summary (
    opportunity_id,
    opportunity_summary_id,
    opportunity_status_id
) VALUES (99998, 99998, 1);

INSERT INTO api.link_opportunity_summary_funding_instrument (
    opportunity_summary_id,
    funding_instrument_id
) VALUES (99998, 1);

INSERT INTO api.opportunity_attachment (
    attachment_id,
    opportunity_id,
    opportunity_attachment_type_id,
    file_location,
    mime_type,
    file_name,
    file_description,
    file_size_bytes
) VALUES (
    99998,
    99998,
    1,
    'test/location',
    'text/plain',
    'test.txt',
    'Test file',
    100
);

-- Verify only one queue entry exists for multiple updates
SELECT 
    (SELECT COUNT(*) FROM api.opportunity_search_index_queue WHERE opportunity_id = 99998) = 1 
    as "Test 2: Single queue entry for multiple updates";

-- Test 3: Update existing record
UPDATE api.opportunity 
SET opportunity_title = 'Updated Title' 
WHERE opportunity_id = 99999;

-- Verify has_update is still true
SELECT has_update as "Test 3: has_update still true after update"
FROM api.opportunity_search_index_queue
WHERE opportunity_id = 99999;

-- Test 4: Link table triggers
INSERT INTO api.opportunity_summary (
    opportunity_summary_id, 
    opportunity_id, 
    summary_description,
    is_forecast
) VALUES (99999, 99999, 'Another Test Summary', false);

INSERT INTO api.link_opportunity_summary_funding_instrument (
    opportunity_summary_id,
    funding_instrument_id
) VALUES (99999, 1);

-- Verify queue entry still exists and has_update is true
SELECT EXISTS (
    SELECT 1 FROM api.opportunity_search_index_queue 
    WHERE opportunity_id = 99999 AND has_update = true
) as "Test 4: Queue entry exists after link table insert";

-- Test 5: Verify timestamps are updating
UPDATE api.opportunity 
SET opportunity_title = 'Another Update' 
WHERE opportunity_id = 99999;

SELECT
    updated_at > created_at as "Test 5: Updated timestamp is newer than created"
FROM api.opportunity_search_index_queue
WHERE opportunity_id = 99999;

-- Output all test data for manual verification
SELECT 'Final Queue State' as description;
SELECT * FROM api.opportunity_search_index_queue WHERE opportunity_id IN (99999, 99998);

-- Cleanup
ROLLBACK;


@chouinar chouinar left a comment


Something I didn't catch on your last PR - but that we need before we can merge this.

We need the Opportunity ORM model to be aware of its record in the queue table, otherwise we can't delete it.

You can see this if you run make console and then try to do:

x = dbs.query(Opportunity).first()
dbs.delete(x)
dbs.commit()

which errors with:

sqlalchemy.exc.IntegrityError: (psycopg.errors.ForeignKeyViolation) update or delete on table "opportunity" violates foreign key constraint "opportunity_search_index_queue_opportunity_id_opportunity_fkey" on table "opportunity_search_index_queue"

We solve this for other foreign keys with cascade="all, delete-orphan" on the relationship.

Alternatively... I suppose there isn't anything stopping us from making this whole "has_update" column just exist in the opportunity table itself. Thoughts?

@mikehgrantsgov

I kind of like the idea of a separate table to handle the update concern, since it can expand later if we need to store other info for the search queue. For that reason I think going the cascade="all, delete-orphan" route makes sense here.


Comment on lines 438 to 440
opportunity: Mapped[Opportunity] = relationship(
    Opportunity, cascade="all, delete-orphan", single_parent=True
)

@chouinar chouinar Oct 29, 2024


You would need this relationship to go the other way. Doing this would delete the opportunity if you deleted the queue record.

Basically the same as current_opportunity_summary in the opportunity model:


    current_opportunity_summary: Mapped["CurrentOpportunitySummary | None"] = relationship(
        back_populates="opportunity", single_parent=True, cascade="all, delete-orphan"
    )
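
Concretely, the inverse pattern would presumably look something like this on the two models (a sketch only; the attribute and class names are assumed, not the exact diff):

    # On the Opportunity model (attribute/class names assumed):
    opportunity_search_index_queue: Mapped["OpportunitySearchIndexQueue | None"] = relationship(
        back_populates="opportunity", single_parent=True, cascade="all, delete-orphan"
    )

    # On the queue model, the relationship then just points back:
    opportunity: Mapped[Opportunity] = relationship(
        Opportunity, back_populates="opportunity_search_index_queue"
    )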

@mikehgrantsgov

Ah, good call. I thought the single_parent property would have done this on the queue. Updated this.


@chouinar chouinar left a comment


LGTM!

@mikehgrantsgov mikehgrantsgov merged commit a12f926 into main Oct 29, 2024
8 checks passed
@mikehgrantsgov mikehgrantsgov deleted the mikehgrantsgov/2603-opportunity-table-triggers branch October 29, 2024 19:20