Fastavro is pinned in at least one of the cloud function requirements files. Here is a test that demonstrates the need for the pin.
```python
import fastavro as fa

# test alert stored with the repo
falert = "tests/test_alerts/ztf_3.3_1154308030015010004.avro"

# standard fastavro call to deserialize into a list of dicts
with open(falert, "rb") as fin:
    list_of_dicts = list(fa.reader(fin))
```
With `fastavro==1.4.4`, this works. With `fastavro==1.7.3`, this fails with `TypeError: float() argument must be a string or a number, not 'NoneType'`.
This is very likely a ZTF-specific problem, due to the non-standard ordering of their Avro schema types (the same problem that causes us to do a "fix schema" step before storing the Avros in the bucket).
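For anyone who hasn't hit this before, here is a hedged illustration of what "non-standard ordering" means. The field name and exact declaration are assumptions for illustration only, not copied from the real ZTF schema:

```python
# Illustration only: the field name and exact declaration here are assumptions,
# not copied from the real ZTF schema.
field_as_published = {
    "name": "magpsf",
    "type": ["float", "null"],  # nullable branch listed second...
    "default": None,            # ...while the default is null (non-standard per the Avro spec)
}
field_after_fix = {
    "name": "magpsf",
    "type": ["null", "float"],  # spec-compliant ordering: the default's type comes first
    "default": None,
}
```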
Would something like `tests/test_ztf_fastavro.py` be a good place?
This might set a broader trend of having survey-specific tests, `tests/test_<survey>_*.py`. The number of surveys we're going to support will be finite (<10), so I think it's fine to keep the tests right there. It then becomes important that we don't break things when updating for just one survey, and that we identify cases where surveys might have incompatible requirements (great sadness). A sketch of such a test is below.
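If we go that route, a minimal sketch could look like the following. The test name and assertions are placeholders, not an agreed-on design:

```python
# tests/test_ztf_fastavro.py -- sketch only; names and assertions are placeholders
import fastavro as fa

FALERT = "tests/test_alerts/ztf_3.3_1154308030015010004.avro"


def test_fastavro_deserializes_ztf_alert():
    """Guard against fastavro releases that fail on the ZTF schema (the TypeError above)."""
    with open(FALERT, "rb") as fin:
        alerts = list(fa.reader(fin))
    assert len(alerts) == 1  # ZTF publishes one alert per Avro file
    assert isinstance(alerts[0], dict)
```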
I believe this is the same issue raised in fastavro/fastavro#676. That issue is marked as resolved. I only skimmed the details, but it seems we should be able to use newer fastavro versions if we pass the schema to the reader, which we don't currently do for ZTF (but do for elasticc because we have to, since those Avro packets are schemaless).
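For reference, a minimal sketch of what that could look like, assuming we keep a local copy of the ZTF alert schema (the `.avsc` path below is hypothetical, not a file the repo ships today):

```python
import fastavro as fa
from fastavro.schema import load_schema

# hypothetical local copy of the (fixed) ZTF alert schema -- not something the repo ships today
schema = load_schema("path/to/ztf_3.3_fixed.avsc")

falert = "tests/test_alerts/ztf_3.3_1154308030015010004.avro"
with open(falert, "rb") as fin:
    # pass the schema explicitly, per the resolution discussed in fastavro/fastavro#676
    list_of_dicts = list(fa.reader(fin, reader_schema=schema))
```

Since we already pass the schema for elasticc, presumably only the ZTF read path would need to change.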
_Originally posted by @troyraen in #187 (comment)_