
Fix issues with using the butler #429

Open
SimonKrughoff opened this issue Jan 6, 2017 · 1 comment

Comments

@SimonKrughoff (Contributor) commented Jan 6, 2017

It's important to the project that Twinkles use the stack's mechanisms (i.e., the butler) to access data. If these are being bypassed for any reason, we should figure out why and fix the underlying problems.

@jchiang87, can you please give some examples of where you've needed to work around the stack functionality, so we can fix the problems you encountered?

@jchiang87 (Contributor) commented Jan 6, 2017

I'm trying to access forced photometry results from the Twinkles Level 2 data, but I'm getting an error. Using this code:

import lsst.daf.persistence as df

repo = 'output'          # Twinkles Level 2 output repository
visit = 230
raft = '2,2'
sensor = '1,1'

butler = df.Butler(repo)
dataId = dict(visit=visit, raft=raft, sensor=sensor)

# Retrieving the calexp with the full dataId works; its metadata prints below.
calexp = butler.get('calexp', dataId=dataId)
print "filter:", calexp.getFilter().getName()
print "fluxmag0:", calexp.getCalib().getFluxMag0()

# The src catalog is requested by visit alone.
src = butler.get('src', visit=visit)

# The forced_src request is where the error below is raised.
forced = butler.get('forced_src', visit=visit)

I am getting this output:

698 [0x7f1421198700] DEBUG daf.persistence.LogicalLocation null - Input string: output
698 [0x7f1421198700] DEBUG daf.persistence.LogicalLocation null - Copy to: output
699 [0x7f1421198700] INFO CameraMapper null - Loading registry registry from output/_parent/registry.sqlite3
1787 [0x7f1421198700] DEBUG daf.persistence.butler null - Get type=calexp keys=DataId(initialdata={'raft': '2,2', 'sensor': '1,1', 'visit': 230}, tag=set([])) from lsst.afw.image.ExposureF at FitsStorage(output/calexp/v230-fr/R22/S11.fits)
filter:1788 [0x7f1421198700] DEBUG daf.persistence.butler null - Starting read from lsst.afw.image.ExposureF at FitsStorage(output/calexp/v230-fr/R22/S11.fits)
1788 [0x7f1421198700] DEBUG daf.persistence.LogicalLocation null - Input string: output/calexp/v230-fr/R22/S11.fits
1788 [0x7f1421198700] DEBUG daf.persistence.LogicalLocation null - Copy to: output/calexp/v230-fr/R22/S11.fits
1788 [0x7f1421198700] DEBUG afw.ExposureFormatter null - ExposureFormatter read start
1788 [0x7f1421198700] DEBUG afw.ExposureFormatter null - ExposureFormatter read FitsStorage
1788 [0x7f1421198700] DEBUG afw.image.Mask null - Number of mask planes: 16
1868 [0x7f1421198700] DEBUG afw.ExposureFormatter null - ExposureFormatter read end
1869 [0x7f1421198700] DEBUG daf.persistence.butler null - Ending read from lsst.afw.image.ExposureF at FitsStorage(output/calexp/v230-fr/R22/S11.fits)
 r
fluxmag0: (6444973905069.26, 1329278212.46883)
1870 [0x7f1421198700] DEBUG daf.persistence.butler null - Get type=src keys=DataId(initialdata={'visit': 230}, tag=set([])) from lsst.afw.table.SourceCatalog at FitsCatalogStorage(output/src/v230-fr/R22/S11.fits)
Traceback (most recent call last):
  File "butler_test.py", line 15, in <module>
    forced = butler.get('forced_src', visit=visit)
  File "/u1/jchiang/miniconda/opt/lsst/daf_persistence/python/lsst/daf/persistence/butler.py", line 586, in get
    location = repoData.repo.map(datasetType, dataId)
  File "/u1/jchiang/miniconda/opt/lsst/daf_persistence/python/lsst/daf/persistence/repository.py", line 180, in map
    loc = self._mapper.map(*args, **kwargs)
  File "/u1/jchiang/miniconda/opt/lsst/daf_persistence/python/lsst/daf/persistence/mapper.py", line 173, in map
    return func(self.validate(dataId), write)
  File "/u1/jchiang/miniconda/opt/lsst/daf_butlerUtils/python/lsst/daf/butlerUtils/cameraMapper.py", line 294, in mapClosure
    return mapping.map(mapper, dataId, write)
  File "/u1/jchiang/miniconda/opt/lsst/daf_butlerUtils/python/lsst/daf/butlerUtils/mapping.py", line 124, in map
    actualId = self.need(iter(self.keyDict.keys()), dataId)
  File "/u1/jchiang/miniconda/opt/lsst/daf_butlerUtils/python/lsst/daf/butlerUtils/mapping.py", line 208, in need
    lookups = self.lookup(newProps, newId)
  File "/u1/jchiang/miniconda/opt/lsst/daf_butlerUtils/python/lsst/daf/butlerUtils/mapping.py", line 178, in lookup
    return self.registry.lookup(properties, self.tables, lookupDataId)
  File "/u1/jchiang/miniconda/opt/lsst/daf_persistence/python/lsst/daf/persistence/registries.py", line 324, in lookup
    c = self.conn.execute(cmd, valueList)
sqlite3.OperationalError: no such column: tract
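
It looks like the failure is in the registry lookup: the visit registry (registry.sqlite3) apparently has no tract column, so the mapper can't fill in the tract key that the forced_src dataset seems to need. A possible workaround (untested; the tract value below is just a guess for this repo) would be to supply tract explicitly in the dataId instead of letting the butler look it up:

import lsst.daf.persistence as df

butler = df.Butler('output')
# tract=0 is an assumed value; the actual tract for the Twinkles field may differ.
dataId = dict(visit=230, raft='2,2', sensor='1,1', tract=0)
forced = butler.get('forced_src', dataId=dataId)
print "forced_src rows:", len(forced)

If that works, the underlying fix is presumably either to add tract to the registry or to change the forced_src mapping so it doesn't require a registry lookup on tract.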
