Master python3 manualmerge 3 (#3)
* parent 23707a1 (dmwm#6818)

Initial changes for python3. Make it possible to run with python3 on sched.

* use gocurl from CVMFS Fix dmwm#6822 (dmwm#6824)

* Belforte patch 1 (dmwm#6825)

* use gocurl from CVMFS Fix dmwm#6822 (dmwm#6823)

* add comment about py2/3 compatibility needs

* use status_cache in pickle format. Fix dmwm#6820 (dmwm#6829)

* Remove most old "Panda" code (dmwm#6835)

* remove PandaServerInterface. for dmwm#6542

* remove unused taskbuffer. For dmwm#6542

* remove useless comment about Panda. For dmwm#6542

* remove PanDAExceptions. For dmwm#6542

* disallow panda scheduler in regexp. for dmwm#6542

* Remove old crab cache code (dmwm#6833)

* remove code in UserFileCache. for dmwm#6776

* remove reference to UserFileCache in setup.py. For dmwm#6776

* remove all code references to UserFileCache. For dmwm#6776

* remove all calls to panda stuff in the code (dmwm#6836)

* remove panda fields. For dmwm#6542

* remove references to pandajobid DB column in code. For dmwm#6542

* remove panda-related JobGroup. For dmwm#6542

* remove useless calls to JobGroup. For dmwm#6542

* remove all references in code to panda, jobset and jobgroups. For dmwm#6542

* Move away mysql fix 6837 (dmwm#6838)

* add a place for obsolete code

* move MYSQL code to obsolete dir. Fix dmwm#6837

* remove Databases/TaskDB/Oracle/JobGroup from build. Fix dmwm#6839 (dmwm#6840)

* use urllib3 in place of urllib2 (dmwm#6841)

* remove couchDb related code. Easy part for dmwm#6834 (dmwm#6842)

* Proper fix for autom split (dmwm#6843)

* py3 fix for hashlib

* proper py3 porting of urllib2.urlopen
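The urllib2 module is gone in python3; urlopen moved to urllib.request and the error classes to urllib.error, and responses now yield bytes. A minimal sketch of the porting pattern, with an illustrative helper name and no claim to match the actual CRABServer code:

```python
# py2: import urllib2; urllib2.urlopen(url).read()
# py3: urlopen lives in urllib.request, HTTPError in urllib.error
from urllib.request import urlopen
from urllib.error import HTTPError

def fetch(url):
    """Return the body of url as str: py3 urlopen yields bytes, so decode."""
    try:
        with urlopen(url, timeout=30) as resp:
            return resp.read().decode("utf-8")
    except HTTPError as ex:
        raise RuntimeError("HTTP error %s from %s" % (ex.code, url))
```

The explicit decode step is the part futurize cannot guess for you: whether the payload is utf-8 text or should stay bytes.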

* remove old code. For dmwm#6845 (dmwm#6847)

* Remove couch db code (dmwm#6848)

* remove couchDb related code. Easy part for dmwm#6834

* remove CouchDB code from DagmanResubmitter. For dmwm#6845

* remove CouchDB code from PostJob. For dmwm#6845

* remove isCouchDBURL, now unused. For dmwm#6845

* one more cleanup in PostJob. For dmwm#6845

* one more cleanup in PostJob. For dmwm#6845

* restore code deleted by mistake

* [py3] src/python/Databases supports py2 and py3 (dmwm#6828)

* src/python/CRABInterface supports py3 (dmwm#6831)

* [py3] src/python/CRABInterface - changes suggested by futurize

* removed uses of deprecated panda code

* validate_str instead of validate_ustr, deprecated in WMCore

* a hack to make it run for minimal purposes (dmwm#6850)

* complete removal of unused taskbuffer

* stop trying to remove failed migrations from 2019. Fix dmwm#6854 (dmwm#6856)

* Port to python3 recent small fixes from master (dmwm#6858)

* use gocurl from CVMFS Fix dmwm#6822 (dmwm#6823)

* add comment about py2/3 compatibility needs (dmwm#6826)

* add GH remote for Diego

* upload new config version (dmwm#6852)

* stop trying to remove failed migrations from 2019. Fix dmwm#6854 (dmwm#6855)

Co-authored-by: Daina <[email protected]>

* better logging of acquired publication files. Fix dmwm#6860 (dmwm#6861)

* remove unused/undef variable. fix dmwm#6864 (dmwm#6865)

* Second batch of fixes for crabserver REST in py3. (dmwm#6873)

* HTCondorWorkflow: decode to str before parsing

* HTCondorWorkflow: convert to str output of literal eval

* slight improvement to Stefano's `horrible hack`

* updated version of wmcore to 1.5.5 in requirements.txt

* Add more logging (dmwm#6877)

* add logging of tmp file removal

* avoid duplicating ids. Fix dmwm#6800

* get task (DAG) status from sched. Fix dmwm#6869 (dmwm#6874)

* get task (DAG) status from sched. Fix dmwm#6869

* improve comments

* rename cache_status_jel to cache_status and use it. Fix dmwm#6411 (dmwm#6878)

* validate both temp and final output LFNs. Fix dmwm#6871 (dmwm#6879)

* change back to use py3 for cache_status

the last commit had mistakenly changed cache_status to use python2

* make migration dbg Utils work in container. Fix dmwm#6853 (dmwm#6886)

* Py3 for publisher (dmwm#6887)

* ensure tasks is a list

* basestring -> string

* no need to cast to unicode

* use python3 to start TaskPublish

* REST and TW - correctly encode/decode input/outputs of b64encode/b64decode
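The commit above addresses the bytes-only base64 API in python3: b64encode/b64decode take and return bytes, so str values must be encoded on the way in and decoded on the way out. A hedged sketch of the wrap/unwrap pattern (helper names are illustrative, not the actual REST/TW code):

```python
from base64 import b64encode, b64decode

def strToB64(text):
    # py3 b64encode needs bytes; hand it utf-8 bytes, return a plain str
    return b64encode(text.encode("utf-8")).decode("ascii")

def b64ToStr(b64text):
    # symmetric unwrap: str -> ascii bytes -> base64-decode -> utf-8 str
    return b64decode(b64text.encode("ascii")).decode("utf-8")

print(strToB64("hello"))  # -> aGVsbG8=
```

Keeping the encode/decode at the edges means the rest of the code deals only in str, which is what makes it work unchanged on both sides of the REST/TW boundary.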

* stop inserting nose in TW tarball. Fix dmwm#6455 (dmwm#6888)

* stop inserting nose in TW tarball. Fix dmwm#6455

* make sure CRAB3.zip exists, improve comments

* improve log

* port to python3 branch of  dmwm@87ada3b

* port to python3 branch of dmwm@9a72d9e

* Make new publisher default (dmwm#6892)

* make NEW_PUBLISHER the default, fix dmwm#6412

* remove code switching NEW_PUBLISHER. Fix dmwm#6410

* add comments

* start Publisher in py3 env (dmwm#6894)

* stupid typo

* py3 crabserver compatible with tasks submitted by py2 crabserver (dmwm#6907)

- tm_split_args: convert to unicode the values in the lists 'lumis' and 'runs'

* crabserver py3 - change tag for build with jenkins (dmwm#6908)

* Make tw work in py3 for dmwm#6899 (dmwm#6901)

* Queue is now lowercase, xrange -> range

* use python3 to start TW

* start TW from python3.8 dir

* workaround ldap currently missing in py3 build

* basestring --> str

* use binary files for pickle

* make sure to handle classAds defined as bytes as well
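The two items above are the usual py3 pitfalls in this area: pickle is a binary protocol, so its files must be opened 'wb'/'rb', and values read back from HTCondor classAds may surface as bytes. A sketch under those assumptions (adToStr is an illustrative helper, not the actual TW code):

```python
import os
import pickle
import tempfile

def adToStr(value):
    # a classAd value may arrive as bytes or str depending on who wrote it
    return value.decode("utf-8") if isinstance(value, bytes) else value

status = {"DagStatus": 1, "NodesTotal": 10}
path = os.path.join(tempfile.mkdtemp(), "status_cache.pkl")
with open(path, "wb") as fp:   # py3: binary mode, 'w' would fail on bytes
    pickle.dump(status, fp)
with open(path, "rb") as fp:   # py3: binary mode on read as well
    assert pickle.load(fp) == status
```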

* remove MonALISA code. Fix dmwm#6911 (dmwm#6913)

* TW - new tag of WMCore with fix to credential/proxy (dmwm#6915)

* TW - remove Logger and ProcInfo from setup.py and from bin/htcondor_make_runtime.sh (dmwm#6916)

* TW - remove Logger and ProcInfo from setup.py

* TW - remove Logger and ProcInfo from bin/htcondor_make_runtime.sh

* TW - remove apmon from setup.py

* TW - update tag of WMCore to mapellidario/py3.211214patch1

* setup.py - remove RESTInteractions from CRABClient build (dmwm#6919)

* generate Error on bad extconfig format, remove old code, cleanup. Fix dmwm#6897 See also dmwm#6897 (comment) (dmwm#6910)

* better py3 compatibility for authenticatedSubprocess. fix dmwm#6899 (comment) (dmwm#6927)

* remove references to asourl/asodb in TW (dmwm#6929)

* [py3] apply py3-modernization changes to whole dmwm/CRABServer (dmwm#6921)

* [py3] migrated TW/Actions/ to py3

* [py3] fix open() mode: str for json, bytes for pickle

* [py3] fix use of hashlib.sha1(): input must be bytes

* TaskWorker/Actions/StageoutCheck: use execute_command, not executeCommand

* Publish utils for py3 (dmwm#6941)

* use python3 to run DebugFailedBlockPublication

* use python3 to run FindFailedBlockPublication

* make py3 compat and improve printout. Fix dmwm#6939

* optionally create new publication

* Fix task publish 6940 (dmwm#6942)

* avoid using undefined variable. Fix dmwm#6940

* make sure all calls to DBS are in try/except for dmwm#6940

* use Rucio client py2 for FTS_transfer.py. Fix dmwm#6948 (dmwm#6949)

* use Rucio client py2 for FTS_transfer.py. Fix dmwm#6948

* add comment about python version

* pass $XrdSecGSISRVNAMES to cmsRun. Fix dmwm#6953 (dmwm#6955) (dmwm#6956)

* Pre dag divide by zero fix 6926 (dmwm#6959)

* protect against probe jobs returning no events. Fix dmwm#6926

* some pylint cleanups

* Cleanup userproxy from rest fix 6931 (dmwm#6960)

* remove unused retrieveUserCert for dmwm#6931

* cleanup unused userproxy from REST fix dmwm#6931

* remove unused imports

* cleanup serverdn/serverproxy/serverkey from REST code. Fix dmwm#6961

* correct kill arguments. Fix dmwm#6928 (dmwm#6964)

* requirements.txt: update wmcore tag (dmwm#6966)

* REST-py3 backward compatible with publisher-py2 (dmwm#6967)

* Fix mkruntime 6970 (dmwm#6971)

* no need for cherrypy in TW tarball. Fix dmwm#6970

* place dummyFile in local dir and cleanup

* remove useless encode. fix dmwm#6972 (dmwm#6973)

* use $STARTDIR for dummyFile. (dmwm#6974)

* enable TaskWorker to use IDTOKENS. Fix dmwm#6903 (dmwm#6975)

* update requirements.txt to dmwm/WMCore 1.5.7 (dmwm#6982)

* use different WEB_DIR for token auth. Fix dmwm#6905 (dmwm#6983)

* correct check for classAd existence. Fix dmwm#6986 (dmwm#6987)

* define CRAB_UserHN ad for task_wrapper. Fix dmwm#6981 (dmwm#6988)

* no spaces around = in bash. properly fix dmwm#6981

* fix not py3-compatible pycurl.error handling in RESTInteractions (dmwm#6996)

* make Pre/Post/RetryJob use existing WEB_DIR. Fix dmwm#6994 (dmwm#6998)

* remove extra / in API name. Fix dmwm#7004 (dmwm#7005)

* Remove extra slash fix 7004 (dmwm#7006)

* remove extra / in API name. Fix dmwm#7004

* remove extra / in API name. Fix dmwm#7004

* restore NoAvailableSite exception for TW. Fix dmwm#7038 (dmwm#7039)

* make sure classAds for matching are ORDERED lists, fix dmwm#7043 (dmwm#7044)

* make sure eventsThr and eventsSize are not used if not initialized. Fix dmwm#7065 (dmwm#7066)

* Adjust code to work with new DBS Go based server (dmwm#6969) (dmwm#7074)

Co-authored-by: Valentin Kuznetsov <[email protected]>

* use python3 for FTS_transfers. Fix dmwm#6909 (dmwm#7052)

* adapt to new DBS serverinfo API (dmwm#7093)

* use WMCore 2.0.1.pre3 - Fix dmwm#7096 (dmwm#7097)

* point user feedback to CmsTalk. Fix dmwm#7100 (dmwm#7101)

Co-authored-by: Stefano Belforte <[email protected]>
Co-authored-by: Daina <[email protected]>
Co-authored-by: Valentin Kuznetsov <[email protected]>
4 people authored Feb 28, 2022
1 parent 3ac0a3a commit cd719d7
Showing 104 changed files with 1,220 additions and 4,119 deletions.
36 changes: 11 additions & 25 deletions bin/htcondor_make_runtime.sh
Original file line number Diff line number Diff line change
@@ -34,19 +34,20 @@ else
fi
pushd $STARTDIR

#
# cleanup, avoid to keep adding to existing tarballs
#

# cleanup, avoid to keep adding to existing tarballs
rm -f $STARTDIR/CRAB3.zip
rm -f $STARTDIR/WMCore.zip
rm -f $STARTDIR/nose.tar.gz

# make sure there's always a CRAB3.zip to avoid errors in other parts
touch $STARTDIR/dummyFile
zip -r $STARTDIR/CRAB3.zip $STARTDIR/dummyFile
rm -f $STARTDIR/dummyFile

# For developers, we download all our dependencies from the various upstream servers.
# For actual releases, we take the libraries from the build environment RPMs.
if [[ "x$RPM_RELEASE" != "x" ]]; then

# I am inside a release building
pushd $ORIGDIR/../WMCore-$WMCOREVER/build/lib/
zip -r $STARTDIR/WMCore.zip *
zip -rq $STARTDIR/CRAB3.zip WMCore PSetTweaks Utils -x \*.pyc || exit 3
@@ -56,16 +57,12 @@ if [[ "x$RPM_RELEASE" != "x" ]]; then
zip -rq $STARTDIR/CRAB3.zip RESTInteractions.py HTCondorUtils.py HTCondorLocator.py TaskWorker CRABInterface TransferInterface -x \*.pyc || exit 3
popd

pushd $VO_CMS_SW_DIR/$SCRAM_ARCH/external/cherrypy/*/lib/python2.7/site-packages
zip -rq $STARTDIR/CRAB3.zip cherrypy -x \*.pyc
popd

mkdir -p bin
cp -r $ORIGDIR/scripts/{TweakPSet.py,CMSRunAnalysis.py,task_process} .
cp $ORIGDIR/src/python/{Logger.py,ProcInfo.py,ServerUtilities.py,RucioUtils.py,CMSGroupMapper.py,RESTInteractions.py} .
cp $ORIGDIR/src/python/{ServerUtilities.py,RucioUtils.py,CMSGroupMapper.py,RESTInteractions.py} .

else

# building runtime tarballs from development area or GH
if [[ -d "$REPLACEMENT_ABSOLUTE/WMCore" ]]; then
echo "Using replacement WMCore source at $REPLACEMENT_ABSOLUTE/WMCore"
WMCORE_PATH="$REPLACEMENT_ABSOLUTE/WMCore"
@@ -85,17 +82,6 @@ else
CRABSERVER_PATH="CRABServer-$CRABSERVERVER"
fi

if [[ ! -e nose.tar.gz ]]; then
curl -L https://github.com/nose-devs/nose/archive/release_1.3.0.tar.gz > nose.tar.gz || exit 2
fi

tar xzf nose.tar.gz || exit 2

pushd nose-release_1.3.0/
zip -rq $STARTDIR/CRAB3.zip nose -x \*.pyc || exit 3
popd


# up until this point, everything in CRAB3.zip is an external
cp $STARTDIR/CRAB3.zip $ORIGDIR/CRAB3-externals.zip


mkdir -p bin
cp -r $CRABSERVER_PATH/scripts/{TweakPSet.py,CMSRunAnalysis.py,task_process} .
cp $CRABSERVER_PATH/src/python/{Logger.py,ProcInfo.py,ServerUtilities.py,RucioUtils.py,CMSGroupMapper.py,RESTInteractions.py} .
cp $CRABSERVER_PATH/src/python/{ServerUtilities.py,RucioUtils.py,CMSGroupMapper.py,RESTInteractions.py} .
fi

pwd
echo "Making TaskManagerRun tarball"
tar zcf $ORIGDIR/TaskManagerRun-$CRAB3_VERSION.tar.gz CRAB3.zip TweakPSet.py CMSRunAnalysis.py task_process Logger.py ProcInfo.py ServerUtilities.py RucioUtils.py CMSGroupMapper.py RESTInteractions.py || exit 4
tar zcf $ORIGDIR/TaskManagerRun-$CRAB3_VERSION.tar.gz CRAB3.zip TweakPSet.py CMSRunAnalysis.py task_process ServerUtilities.py RucioUtils.py CMSGroupMapper.py RESTInteractions.py || exit 4
echo "Making CMSRunAnalysis tarball"
tar zcf $ORIGDIR/CMSRunAnalysis-$CRAB3_VERSION.tar.gz WMCore.zip TweakPSet.py CMSRunAnalysis.py Logger.py ProcInfo.py ServerUtilities.py CMSGroupMapper.py RESTInteractions.py || exit 4
tar zcf $ORIGDIR/CMSRunAnalysis-$CRAB3_VERSION.tar.gz WMCore.zip TweakPSet.py CMSRunAnalysis.py ServerUtilities.py CMSGroupMapper.py RESTInteractions.py || exit 4
popd
2 changes: 1 addition & 1 deletion bin/logon_myproxy_openssl.py
@@ -18,5 +18,5 @@
'server_cert': sys.argv[3],}
timeleftthreshold = 60 * 60 * 24
mypclient = SimpleMyProxy(defaultDelegation)
userproxy = mypclient.logonRenewMyProxy(username=sha1(sys.argv[4]+userdn).hexdigest(), myproxyserver=myproxyserver, myproxyport=7512)
userproxy = mypclient.logonRenewMyProxy(username=sha1((sys.argv[4]+userdn).encode("utf8")).hexdigest(), myproxyserver=myproxyserver, myproxyport=7512)
print ("Proxy Retrieved with len ", len(userproxy))
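The one-line change above is the generic py3 hashlib fix: sha1() and friends refuse str input, so the concatenated string must be encoded before hashing. As a standalone sketch (the helper name is illustrative):

```python
from hashlib import sha1

def hashedUsername(username, userdn):
    # py3: hashlib constructors digest bytes, so encode the str first
    return sha1((username + userdn).encode("utf8")).hexdigest()
```

Note the parentheses around the concatenation: encoding only the last operand (`username + userdn.encode(...)`) would still raise a TypeError when mixing str and bytes.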
File renamed without changes.
@@ -3,7 +3,6 @@
"""
import threading
import string

from WMCore.Database.DBCreator import DBCreator
from Databases.FileMetaDataDB.Oracle.Create import Create
@@ -29,6 +28,6 @@ def __init__(self, logger = None, dbi = None, param=None):
i = 0
for tableName in orderedTables:
i += 1
prefix = string.zfill(i, 2)
prefix = str(i).zfill(2)
self.create[prefix + tableName] = "DROP TABLE %s" % tableName
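This hunk (repeated in the other Oracle Destroy modules below) replaces the py2 string module's function form, which was removed in python3; zfill has long been a str method and works on both. A sketch of the pattern, with placeholder table names:

```python
# py2: prefix = string.zfill(i, 2)   # module-level function, gone in py3
# py3: prefix = str(i).zfill(2)      # str method, available in py2 and py3
orderedTables = ["filemetadata", "tasks"]   # placeholder names
create = {}
for i, tableName in enumerate(orderedTables, start=1):
    prefix = str(i).zfill(2)                # 1 -> '01', 2 -> '02'
    create[prefix + tableName] = "DROP TABLE %s" % tableName

print(sorted(create))  # -> ['01filemetadata', '02tasks']
```

The zero-padded prefix keeps the dict keys in drop order when sorted, which is what the Destroy classes rely on.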

@@ -3,7 +3,6 @@
"""
import threading
import string

from WMCore.Database.DBCreator import DBCreator
from Databases.TaskDB.Oracle.Create import Create
@@ -29,5 +28,5 @@ def __init__(self, logger = None, dbi = None, param=None):
i = 0
for tableName in orderedTables:
i += 1
prefix = string.zfill(i, 2)
prefix = str(i).zfill(2)
self.create[prefix + tableName] = "DROP TABLE %s" % tableName
File renamed without changes.
@@ -3,7 +3,6 @@
"""
import threading
import string

from WMCore.Database.DBCreator import DBCreator
from Databases.TaskDB.Oracle.Create import Create
@@ -29,5 +28,5 @@ def __init__(self, logger = None, dbi = None, param=None):
i = 0
for tableName in orderedTables:
i += 1
prefix = string.zfill(i, 2)
prefix = str(i).zfill(2)
self.create[prefix + tableName] = "DROP TABLE %s" % tableName
File renamed without changes.
3 changes: 3 additions & 0 deletions obsolete/README.MD
@@ -0,0 +1,3 @@
### CRAB OBSOLETE
A place for files that are no longer needed in the CRABServer repo but may be useful as examples, history, or otherwise

3 changes: 2 additions & 1 deletion requirements.txt
@@ -3,4 +3,5 @@
# Format:
# Dependency==version

wmcver==1.5.3
wmcver==2.0.1.pre3

48 changes: 26 additions & 22 deletions scripts/AdjustSites.py
@@ -14,10 +14,10 @@
import time
import glob
import shutil
import urllib
from urllib.parse import urlencode
import traceback
from datetime import datetime
from httplib import HTTPException
from http.client import HTTPException

import classad
import htcondor
@@ -210,7 +210,10 @@ def makeWebDir(ad):
"""
Need a doc string here.
"""
path = os.path.expanduser("~/%s" % ad['CRAB_ReqName'])
if 'AuthTokenId' in ad:
path = os.path.expanduser("/home/grid/%s/%s" % (ad['CRAB_UserHN'], ad['CRAB_ReqName']))
else:
path = os.path.expanduser("~/%s" % ad['CRAB_ReqName'])
try:
## Create the web directory.
os.makedirs(path)
@@ -238,23 +241,6 @@ def makeWebDir(ad):
os.symlink(os.path.abspath(os.path.join(".", ".job.ad")), os.path.join(path, "job_ad.txt"))
os.symlink(os.path.abspath(os.path.join(".", "task_process/status_cache.txt")), os.path.join(path, "status_cache"))
os.symlink(os.path.abspath(os.path.join(".", "task_process/status_cache.pkl")), os.path.join(path, "status_cache.pkl"))
# prepare a startup cache_info file with time info for client to have something useful to print
# in crab status while waiting for task_process to fill with actual jobs info. Do it in two ways
# new way: a pickle file for python3 compatibility
startInfo = {'bootstrapTime': {}}
startInfo['bootstrapTime']['date'] = datetime.utcnow().strftime("%Y-%m-%d %H:%M:%S UTC")
startInfo['bootstrapTime']['fromEpoch'] = int(time.time())
with open(os.path.abspath(os.path.join(".", "task_process/status_cache.pkl")), 'w') as fp:
pickle.dump(startInfo, fp)
# old way: a file with multiple lines and print-like output
startInfo = "# Task bootstrapped at " + datetime.utcnow().strftime("%Y-%m-%d %H:%M:%S UTC") + "\n"
startInfo += "%d\n" % (int(time.time())) # machines will like seconds from Epoch more
# prepare fake status_cache info to please current (v3.210127) CRAB Client
fakeInfo = startInfo + "{"
fakeInfo += "'DagStatus': {'SubDagStatus': {}, 'Timestamp': 0L, 'NodesTotal': 1L, 'SubDags': {}, 'DagStatus': 1L}"
fakeInfo += "}\n{}\n"
with open(os.path.abspath(os.path.join(".", "task_process/status_cache.txt")), 'w') as fd:
fd.write(fakeInfo)
os.symlink(os.path.abspath(os.path.join(".", "prejob_logs/predag.0.txt")), os.path.join(path, "AutomaticSplitting_Log0.txt"))
os.symlink(os.path.abspath(os.path.join(".", "prejob_logs/predag.0.txt")), os.path.join(path, "AutomaticSplitting/DagLog0.txt"))
os.symlink(os.path.abspath(os.path.join(".", "prejob_logs/predag.1.txt")), os.path.join(path, "AutomaticSplitting/DagLog1.txt"))
@@ -266,6 +252,24 @@ def makeWebDir(ad):
except Exception as ex: #pylint: disable=broad-except
#Should we just catch OSError and IOError? Is that enough?
printLog("Failed to copy/symlink files in the user web directory: %s" % str(ex))

# prepare a startup cache_info file with time info for client to have something useful to print
# in crab status while waiting for task_process to fill with actual jobs info. Do it in two ways
# new way: a pickle file for python3 compatibility
startInfo = {'bootstrapTime': {}}
startInfo['bootstrapTime']['date'] = datetime.utcnow().strftime("%Y-%m-%d %H:%M:%S UTC")
startInfo['bootstrapTime']['fromEpoch'] = int(time.time())
with open(os.path.abspath(os.path.join(".", "task_process/status_cache.pkl")), 'wb') as fp:
pickle.dump(startInfo, fp)
# old way: a file with multiple lines and print-like output
startInfo = "# Task bootstrapped at " + datetime.utcnow().strftime("%Y-%m-%d %H:%M:%S UTC") + "\n"
startInfo += "%d\n" % (int(time.time())) # machines will like seconds from Epoch more
# prepare fake status_cache info to please current (v3.210127) CRAB Client
fakeInfo = startInfo + "{"
fakeInfo += "'DagStatus': {'SubDagStatus': {}, 'Timestamp': 0L, 'NodesTotal': 1L, 'SubDags': {}, 'DagStatus': 1L}"
fakeInfo += "}\n{}\n"
with open(os.path.abspath(os.path.join(".", "task_process/status_cache.txt")), 'w') as fd:
fd.write(fakeInfo)
printLog("WEB_DIR created, sym links in place and status_cache initialized")

try:
@@ -287,7 +291,7 @@ def uploadWebDir(crabserver, ad):

try:
printLog("Uploading webdir %s to the REST" % data['webdirurl'])
crabserver.post(api='task', data=urllib.urlencode(data))
crabserver.post(api='task', data=urlencode(data))
return 0
except HTTPException as hte:
printLog(traceback.format_exc())
@@ -314,7 +318,7 @@ def saveProxiedWebdir(crabserver, ad):
if proxied_webDir: # Prefer the proxied webDir to the non-proxied one
ad[webDir_adName] = str(proxied_webDir)

if ad[webDir_adName]:
if webDir_adName in ad:
# This condor_edit is required because in the REST interface we look for the webdir if the DB upload failed (or in general if we use the "old logic")
# See https://github.com/dmwm/CRABServer/blob/3.3.1507.rc8/src/python/CRABInterface/HTCondorDataWorkflow.py#L398
dagJobId = '%d.%d' % (ad['ClusterId'], ad['ProcId'])
4 changes: 2 additions & 2 deletions scripts/CMSRunAnalysis.py
@@ -745,7 +745,7 @@ def StripReport(report):
print("== Execution site from site-local-config.xml: %s" % slCfg.siteName)
with open('jobReport.json', 'w') as of:
json.dump(rep, of)
with open('jobReportExtract.pickle', 'w') as of:
with open('jobReportExtract.pickle', 'wb') as of:
pickle.dump(rep, of)
print("==== Report file creation FINISHED at %s ====" % time.asctime(time.gmtime()))
except FwkJobReportException as FJRex:
@@ -764,7 +764,7 @@ def StripReport(report):
try:
oldName = 'UNKNOWN'
newName = 'UNKNOWN'
for oldName, newName in literal_eval(options.outFiles).iteritems():
for oldName, newName in literal_eval(options.outFiles).items():
os.rename(oldName, newName)
except Exception as ex:
handleException("FAILED", EC_MoveOutErr, "Exception while moving file %s to %s." %(oldName, newName))
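The hunk above swaps py2's dict.iteritems() for .items(), which in py3 returns a view that iterates the same way. A self-contained sketch of the rename step (the option string here is a placeholder, not real task output):

```python
from ast import literal_eval

# CMSRunAnalysis receives the old->new output-file mapping as a stringified dict
outFiles = "{'out.root': 'out_1.root', 'hist.root': 'hist_1.root'}"
renames = literal_eval(outFiles)
for oldName, newName in renames.items():   # py2's .iteritems() no longer exists
    print("rename %s -> %s" % (oldName, newName))
```

literal_eval is the safe way to parse such an option: it accepts only Python literals, unlike eval.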
2 changes: 1 addition & 1 deletion scripts/Utils/DebugFailedBlockPublication.py
@@ -1,4 +1,4 @@
#!/usr/bin/env python
#!/usr/bin/env python3
# coding: utf-8
from __future__ import division
from __future__ import print_function
16 changes: 12 additions & 4 deletions scripts/Utils/FindFailedMigrations.py
@@ -1,4 +1,4 @@
#!/usr/bin/env python
#!/usr/bin/env python3
# coding: utf-8
from __future__ import print_function
from __future__ import division
@@ -7,8 +7,6 @@
from datetime import datetime
import argparse

# this is needed to make it possible for the following import to work
import CRABClient #pylint: disable=unused-import
from dbs.apis.dbsClient import DbsApi


@@ -60,10 +58,20 @@ def readAndParse(csvFile, apiMig):
def main():
parser = argparse.ArgumentParser()
parser.add_argument('--file', help='log file of terminally failed migrations in CSV format',
default='TerminallyFailedLog.txt')
default='/data/srv/Publisher/logs/migrations/TerminallyFailedLog.txt')
args = parser.parse_args()
logFile = os.path.abspath(args.file)

# if X509 vars are not defined, use default Publisher location
userProxy = os.getenv('X509_USER_PROXY')
if userProxy:
os.environ['X509_USER_CERT'] = userProxy
os.environ['X509_USER_KEY'] = userProxy
if not os.getenv('X509_USER_CERT'):
os.environ['X509_USER_CERT'] = '/data/certs/servicecert.pem'
if not os.getenv('X509_USER_KEY'):
os.environ['X509_USER_KEY'] = '/data/certs/servicekey.pem'

migUrl = 'https://cmsweb-prod.cern.ch/dbs/prod/phys03/DBSMigrate'
apiMig = DbsApi(url=migUrl)

50 changes: 29 additions & 21 deletions scripts/Utils/RemoveFailedMigration.py
@@ -1,13 +1,12 @@
#!/usr/bin/env python
#!/usr/bin/env python3
# coding: utf-8
from __future__ import print_function
from __future__ import division

import os
from datetime import datetime
import argparse

# this is needed to make it possible for the following import to work
import CRABClient #pylint: disable=unused-import
from dbs.apis.dbsClient import DbsApi

def main():
@@ -16,6 +15,16 @@ def main():
args = parser.parse_args()
migrationId = int(args.id)

# if X509 vars are not defined, use default Publisher location
userProxy = os.getenv('X509_USER_PROXY')
if userProxy:
os.environ['X509_USER_CERT'] = userProxy
os.environ['X509_USER_KEY'] = userProxy
if not os.getenv('X509_USER_CERT'):
os.environ['X509_USER_CERT'] = '/data/certs/servicecert.pem'
if not os.getenv('X509_USER_KEY'):
os.environ['X509_USER_KEY'] = '/data/certs/servicekey.pem'

migUrl = 'https://cmsweb-prod.cern.ch/dbs/prod/phys03/DBSMigrate'
apiMig = DbsApi(url=migUrl)

@@ -39,10 +48,10 @@ def main():
print("migrationId: %d was created on %s by %s for block:" % (migrationId, created, creator))
print(" %s" % block)

answer = raw_input("Do you want to remove it ? Yes/[No]: ")
answer = input("Do you want to remove it ? Yes/[No]: ")
if answer in ['Yes', 'YES', 'Y', 'y', 'yes']:
answer = 'Yes'
if answer is not 'Yes':
if answer != 'Yes':
return

print("\nRemoving it...")
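Besides raw_input becoming input, the hunk above fixes a genuine bug: `is` tests object identity, not value equality, so `answer is not 'Yes'` could be True even when the strings compare equal (python 3.8+ emits a SyntaxWarning for literal identity tests for exactly this reason). A small illustration:

```python
# build a string equal to 'Yes' but, in general, a distinct object
answer = "".join(["Y", "e", "s"])

assert answer == "Yes"   # value comparison: always what this prompt check wants
# `answer is "Yes"` depends on string-interning details and may be False,
# which is why the diff replaces `is not 'Yes'` with `!= 'Yes'`
```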
@@ -52,22 +61,21 @@ def main():
print("Migration removal failed with this exception:\n%s" % str(ex))
return
print("Migration %d successfully removed\n" % migrationId)
print("CRAB Publisher will issue such a migration request again as/when needed")
print("but if you want to recreated it now, you can do it with this python fragment")
print("\n ===============\n")
print("import CRABClient")
print("from dbs.apis.dbsClient import DbsApi")
print("globUrl='https://cmsweb-prod.cern.ch/dbs/prod/global/DBSReader'")
print("migUrl='https://cmsweb-prod.cern.ch/dbs/prod/phys03/DBSMigrate'")
print("apiMig = DbsApi(url=migUrl)")
print("block='%s'" % block)
print("data= {'migration_url': globUrl, 'migration_input': block}")
print("result = apiMig.submitMigration(data)")
print("newId = result.get('migration_details', {}).get('migration_request_id')")
print("print('new migration created: %d' % newId)")
print("status = apiMig.statusMigration(migration_rqst_id=newId)")
print("print(status)")
print("\n ===============\n")
print("CRAB Publisher will issue such a migration request again as/when needed.")
print("But if you want to re-create it now, you can by answering yes here")
answer = input("Do you want to re-create the migration request ? Yes/[No]: ")
if answer in ['Yes', 'YES', 'Y', 'y', 'yes']:
answer = 'Yes'
if answer != 'Yes':
return
print("\nSubmitting new migration request...")
globUrl = 'https://cmsweb-prod.cern.ch/dbs/prod/global/DBSReader'
data = {'migration_url': globUrl, 'migration_input': block}
result = apiMig.submitMigration(data)
newId = result.get('migration_details', {}).get('migration_request_id')
print('new migration created: %d' % newId)
status = apiMig.statusMigration(migration_rqst_id=newId)
print(status)
return

if __name__ == '__main__':
