To my (very inexperienced) eye, the source of the bug is in how `da.map_blocks` is applied over the `allele_mapping` and `variant_allele` arrays. `variant_allele` is a chunked dask array, whereas `allele_mapping` is an in-memory numpy array. When `map_blocks` runs, each call receives a single chunk of `variant_allele` but the entire `allele_mapping` array, so the size check fails: it compares the size of one chunk of `variant_allele` against the whole `allele_mapping` array.
I've attempted to fix this in my PR by chunking `allele_mapping` to match the chunks of `variant_allele` (see here). It now seems to be working OK.
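To illustrate the failure mode described above, here is a minimal sketch. The names `apply_mapping`, the array shapes, and the dtypes are all hypothetical stand-ins, not the real library code; the point is only how `map_blocks` handles a numpy argument versus a matching chunked dask argument.

```python
import numpy as np
import dask.array as da

n_variants, n_alleles = 10, 4

# A chunked dask array, standing in for variant_allele.
variant_allele = da.from_array(
    np.arange(n_variants * n_alleles).reshape(n_variants, n_alleles),
    chunks=(5, n_alleles),  # chunked along the variants axis
)

# An in-memory numpy array, standing in for allele_mapping.
allele_mapping = np.zeros((n_variants, n_alleles), dtype=np.int8)

def apply_mapping(va, am):
    # Hypothetical kernel: the real function does more, but a shape/size
    # check like this is where the mismatch surfaces.
    assert va.shape == am.shape
    return va

# BROKEN: a plain numpy argument is passed whole to every block, so each
# call compares a (5, 4) chunk of variant_allele against the full (10, 4)
# allele_mapping and the assertion fails.
# da.map_blocks(apply_mapping, variant_allele, allele_mapping,
#               dtype=variant_allele.dtype).compute()

# FIX: chunk allele_mapping to match variant_allele, so map_blocks pairs
# up corresponding blocks of the two arrays.
allele_mapping_dask = da.from_array(allele_mapping, chunks=variant_allele.chunks)
mapped = da.map_blocks(
    apply_mapping, variant_allele, allele_mapping_dask, dtype=variant_allele.dtype
)
result = mapped.compute()
```

With matching chunks, each invocation of the kernel sees block pairs of identical shape, which is the essence of the fix in the PR.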
Just to update, I rolled the fix for this into other work I was doing on improving the biallelic SNP calls functions, via #623. Thanks again for figuring it out!
When I try to retrieve `variant_allele` data from the results of `biallelic_snp_calls()`, where I have specified `n_snps`, I get an `AssertionError`.

`snp_calls()` works fine:

Returns:

`biallelic_snp_calls()` also works fine:

Returns:

`biallelic_snp_calls()` where I have specified `n_snps`...

Returns:

This looks like some kind of mismatch in the expected size of arrays in the `apply_allele_mapping()` function.