Video endpoint bid selection enhancements #1419
Conversation
exchange/auction_test.go
Outdated
  t.Errorf("%s: [CACHE_ERROR] More elements were cached than expected \n", fileDisplayName)
- } else { // len(specData.ExpectedCacheables) == len(cache.items)
+ } else if len(specData.ExpectedCacheables) == len(cache.items) && isVideo {
I think you can just leave this as an 'else' here (leaving the condition commented out like it was before). In the event that isVideo is set and more bids than expected were cached, it looks to me like the code below will still work (it just checks that all of the expected cacheables were indeed cached, and that should still be the case).
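Just to illustrate the point (a hypothetical stand-in in Go, not the actual test code): a check that only asserts every expected cacheable is present keeps passing even when extra items, e.g. extra video bids, end up in the cache.

```go
package main

import "fmt"

// containsAll mimics the kind of check referred to above: it only verifies
// that every expected cacheable is present, so extra cached items do not
// cause a failure.
func containsAll(expected []string, cached map[string]bool) bool {
	for _, e := range expected {
		if !cached[e] {
			return false
		}
	}
	return true
}

func main() {
	cached := map[string]bool{"bid-a": true, "bid-b": true, "bid-c": true}
	// Two bids expected, three cached: the extra entry is ignored.
	fmt.Println(containsAll([]string{"bid-a", "bid-b"}, cached)) // true
}
```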
Right, fixed.
exchange/targeting.go
Outdated
  for impId, topBidsPerImp := range auc.winningBidsByBidder {
    overallWinner := auc.winningBids[impId]
    for bidderName, topBidPerBidder := range topBidsPerImp {
-     isOverallWinner := overallWinner == topBidPerBidder
+     isOverallWinner := overallWinner == topBidPerBidder || isVideo
Won't setting all bids as winning bids cause key collisions between bids from different bidders? Why is setting includeBidderKeys to true not a solution to the problem?
@hhhjort Hi Hans, just tested it with includeBidderKeys = true. In this case every parameter in the targeting data has _biddername at the end, for example hb_pb_appnexus -> "12.00" and hb_uuid_appnexus -> "d81d1315-6060-4c42-bfa7-430f26f8d7d8" instead of hb_pb and hb_uuid. With these keys we filter out such bids in the video endpoint as "losing" bids, because they don't have the hb_uuid key:

if tempRespBidExt.Prebid.Targeting[string(openrtb_ext.HbVastCacheKey)] == "" { continue }
Do you mean collisions in hb_pb_cat_dur keys? We run the applyCategoryMapping function and deduplicate bids before the auction and caching logic.
Ahh, yes, the keys are segregated by bid initially, which keeps them separate. Still, it seems inelegant to push the isVideo flag down from the top in order to change the targeting key generation behavior. We should probably consider putting the flag somewhere in the request itself. The reason for the original behavior (only counting the "winning" bids) was that as the number of bidders increased, the demands on cache increased, and our initial estimates were that the cost of caching so many video bids would render the prebid video service uneconomical. So if we have a flag inside the request, we can have it on for requests that only invoke 2-3 bidders, and turn it off for the larger requests.
Also, filtering out the "losing" bids is exactly what this PR is trying to turn off, but by adding a workaround rather than removing the filtering. So that seems like another count against this approach.
I am also not really happy with this flag. We need to distinguish video and non-video requests in the exchange somehow, and we deal with the OpenRTB bid request only, so we cannot add any new flags to it. We can add this flag in the targeting data only; do you want me to proceed with this?
Cache usage will increase in this case, true. As for limiting bidders, do we want a max number of bidders in config? If we have bids back from 5 bidders, should we select the 3 bids per imp with the better prices and filter out the other 2 bids?
Update. @jmaynardxandr and I just had a discussion on this, and here are some points we came up with:
- Modify applyCategoryMapping, the deduplication part of it:
  - sort bids by pod
  - for the same pod, deduplicate bids by category only, not by hb_pb_cat_dur
  - instead of removing a random bid, keep the bid with the better price (or price per second)
- Implement an "altering auction" where we return only 3 bids per imp (max bids per imp, a value from config) and they will all be winning. In case we have 5 bids back per impression, we will return the best 3 of them (by price only, or by price per second).
If we have a new "alteringAuction" function, we can rely on labels and don't need to create any new ugly variables:

if labels.RType == pbsmetrics.ReqTypeVideo {
    alteringAuction
} else {
    auc = newAuction(adapterBids, len(bidRequest.Imp))
}
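For illustration, a minimal sketch of the per-imp selection such an alteringAuction could perform — the types, names, and price-only ordering here are assumptions for the example, not the actual prebid-server implementation:

```go
package main

import (
	"fmt"
	"sort"
)

// bid is a simplified stand-in for a bid with a price.
type bid struct {
	bidder string
	price  float64
}

// topBidsPerImp keeps at most maxBids of the highest-priced bids for each
// impression; all of the returned bids would be treated as "winning".
func topBidsPerImp(bidsByImp map[string][]bid, maxBids int) map[string][]bid {
	out := make(map[string][]bid, len(bidsByImp))
	for impID, bids := range bidsByImp {
		sorted := append([]bid(nil), bids...)
		sort.Slice(sorted, func(i, j int) bool { return sorted[i].price > sorted[j].price })
		if len(sorted) > maxBids {
			sorted = sorted[:maxBids]
		}
		out[impID] = sorted
	}
	return out
}

func main() {
	bids := map[string][]bid{
		"imp1": {{"appnexus", 12.00}, {"rubicon", 9.50}, {"bidderA", 7.00}, {"bidderB", 5.00}, {"bidderC", 3.00}},
	}
	// Keeps the best 3 of the 5 bids for imp1.
	fmt.Println(topBidsPerImp(bids, 3))
}
```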
@hhhjort does this make sense to you?
I think we want to configure the number of bids we want to cache and return, likely in the PBS config rather than the request. We don't know what the proper balance will be between the number of bids and cost ... each additional bid gives us a diminishing return, and at some point the cost of caching will outweigh the benefit gained from the extra bid. In addition, this balance point could easily be different for different PBS hosts.
Perhaps a config for the maximum amount of bidding/caching the host wants to do, and an in-request parameter from the publisher for the level they would like to see. That would allow publishers to tune the requests per the performance they see, but not let them dial it up past the level the host is willing to support.
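As a rough sketch of that idea (the function and parameter names are hypothetical, only the clamping logic is the point): the publisher-requested level would be capped at whatever maximum the host has configured.

```go
package main

import "fmt"

// effectiveBidCap clamps the per-imp bid/cache level requested by the
// publisher to the maximum the PBS host is willing to support in its config.
func effectiveBidCap(hostMax, requested int) int {
	if requested <= 0 || requested > hostMax {
		return hostMax
	}
	return requested
}

func main() {
	fmt.Println(effectiveBidCap(3, 5)) // host cap 3, publisher asks for 5 -> 3
	fmt.Println(effectiveBidCap(3, 2)) // publisher asks for 2 -> 2
}
```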
@hhhjort Agreed. Eventually we would like to find the right balance between the cost of bid caching and the value of choosing among more bids; however, in the phase where demand and inventory are just ramping up, this limit would act as a negative. Later, we can make a decision based on data (win rates based on various factors such as category and creative duration, etc.) to figure out what the number should look like. For now, the idea is to limit to 1 winning bid per imp for each bidder/adapter. Ideally we need this only for long-form video; other inventory types may not need this. Please suggest if and how we can apply this only to long-form video.
The straightforward way would be to have the video endpoint request includebidderkeys in the request it passes to the openrtb code. Then it can read out the bidder-specific versions of the targeting keys (such as hb_pb_<bidder> in place of hb_pb). Another option is to not have the openrtb subsystem generate the keys at all, but instead have the video endpoint read the bids and execute its own code to extract the key values. This might be best if you suspect that video's needs will diverge significantly from what the openrtb endpoint does.
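A small sketch of the first option, assuming includebidderkeys is on and using a hypothetical helper (not existing prebid-server code) that reads the bidder-suffixed targeting keys:

```go
package main

import "fmt"

// bidderTargeting reads hb_pb_<bidder> and hb_uuid_<bidder> from a bid's
// targeting map; empty strings mean the bid has no bidder-specific keys.
func bidderTargeting(targeting map[string]string, bidder string) (price, cacheID string) {
	price = targeting["hb_pb_"+bidder]
	cacheID = targeting["hb_uuid_"+bidder]
	return
}

func main() {
	t := map[string]string{
		"hb_pb_appnexus":   "12.00",
		"hb_uuid_appnexus": "d81d1315-6060-4c42-bfa7-430f26f8d7d8",
	}
	p, id := bidderTargeting(t, "appnexus")
	fmt.Println(p, id)
}
```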
@hhhjort Ready for review
endpoints/openrtb2/video_auction.go
Outdated
@@ -626,6 +634,7 @@ func createBidExtension(videoRequest *openrtb_ext.BidRequestVideo) ([]byte, erro
  IncludeWinners: true,
You may not need IncludeWinners now.
Removed.
@@ -7,7 +7,7 @@ import (
  "github.com/prebid/prebid-server/openrtb_ext"
 )

-const maxKeyLength = 20
+const MaxKeyLength = 20
If you want, you can make this a config option rather than being hardcoded. There has been some talk about making this configurable, but no one has gotten around to it yet. Not required in this PR.
@hhhjort I agree this should be a config option. Can we leave it for later, because this PR has first priority now?
We already have a story for this internally. Not required for this PR.
Two comments on things you may change, but otherwise looking very good now. Waiting on your feedback on the two points before I approve.
… tests modified accordingly
Thank you for your comments, @hhhjort! File: multiImpVideoNoIncludeBidderKeys.json has … otherwise the doCache test fails due to nil cached data because of this check: … About moving …
Making the cache key length configurable can certainly wait.
…ter-video

* origin/master: (26 commits)
  - Fix bid dedup (prebid#1456)
  - moving docs to website repo (prebid#1443)
  - Fixing comment for usage of deal priority field (prebid#1451)
  - Fix minor error message spelling mistake "vastml" -> "vastxml" (prebid#1455)
  - Adform adapter: additional targeting params added (prebid#1424)
  - Added adpod_id to request extension (prebid#1444)
  - Fixes bug (prebid#1448)
  - Validate External Cache Host (prebid#1422)
  - Enable geo activation of GDPR flag (prebid#1427)
  - Update the fallback GVL to last version (prebid#1440)
  - Fix no bid debug log (prebid#1375)
  - Eplanning adapter: Get domain from page (prebid#1434)
  - Fix TCF1 Fetcher Fallback (prebid#1438)
  - Refactor rate converter separating scheduler from converter logic to improve testability (prebid#1394)
  - [WIP] Bid deduplication enhancement (prebid#1430)
  - Video endpoint bid selection enhancements (prebid#1419)
  - update to the latest go-gdpr release (prebid#1436)
  - Default TCF1 GVL in anticipation of IAB no longer hosting the v1 GVL (prebid#1433)
  - Tcf2 id support (prebid#1420)
  - Remove redundad struct (prebid#1432)
  - ...
Video endpoint bid selection enhancement: return all winning bids from all bidders per impression instead of one overall winning bid