TransWikia.com

ArcPy "Cannot open table for Load" error

Geographic Information Systems Asked by Mac Maclean on September 30, 2021

I have been using some slightly modified code based on socalgis.org to extract data from map services. I had no problems until this week, when I started getting the error

RuntimeError: Cannot open table for Load

I am getting this on services that had worked fine in the past. For example, the following URL works fine with no error, while this one fails with the “Cannot open table for Load” error. Both are “esriGeometryPolygon” layers with a small number of features on the same server.

Record extract limit: 1000
Number of target records: 403
Gathering records...
  OBJECTID >= 1 and OBJECTID <= 403
  Query: https://services.gis.ca.gov/arcgis/rest/services/Environment/Weather_stations/MapServer/2/query?where=OBJECTID >= 1 and OBJECTID <= 403&returnGeometry=true&outFields=*&f=json


Traceback (most recent call last):

  File "<ipython-input-39-454949d8d113>", line 1, in <module>
    runfile('[Path to code]', wdir='[Path to code]')

  File "C:\Users\User\AppData\Local\ESRI\conda\envs\arcgispro-py3-clone11\lib\site-packages\spyder_kernels\customize\spydercustomize.py", line 827, in runfile
    execfile(filename, namespace)

  File "C:\Users\User\AppData\Local\ESRI\conda\envs\arcgispro-py3-clone11\lib\site-packages\spyder_kernels\customize\spydercustomize.py", line 110, in execfile
    exec(compile(f.read(), filename, 'exec'), namespace)

  File "[Path to code]", line 50, in <module>
    fs[i].load(urlstring)

  File "C:\Program Files\ArcGIS\Pro\Resources\ArcPy\arcpy\arcobjects\arcobjects.py", line 421, in load
    return convertArcObjectToPythonObject(self._arc_object.Load(*gp_fixargs(args)))

RuntimeError: RecordSetObject: Cannot open table for Load

The error seems to revolve around fs[i].load(urlstring). My code is below:

import arcpy
import urllib.request
import json

arcpy.env.overwriteOutput = True
baseURL = "[URL to map service]"
fields = "*"
outdata = "[pathtogdb]"

# Get record extract limit
urlstring = baseURL + "?f=json"
j = urllib.request.urlopen(urlstring)
js = json.load(j)
maxrc = int(js["maxRecordCount"])
print("Record extract limit: %s" % maxrc)

# Get object ids of features
where = "1=1"
urlstring = baseURL + "/query?where={}&returnIdsOnly=true&f=json".format(where)
j = urllib.request.urlopen(urlstring)
js = json.load(j)
idfield = js["objectIdFieldName"]
idlist = js["objectIds"]
idlist.sort()
numrec = len(idlist)
print("Number of target records: %s" % numrec)

# Gather features
print("Gathering records...")
fs = dict()
for i in range(0, numrec, maxrc):
  torec = min(i + maxrc - 1, numrec - 1)  # clamp the last chunk to the end of the list
  fromid = idlist[i]
  toid = idlist[torec]
  where = "{} >= {} and {} <= {}".format(idfield, fromid, idfield, toid)
  print("  {}".format(where))
  urlstring = baseURL + "/query?where={}&returnGeometry=true&outFields={}&f=json".format(where, fields)
  print("  Query: %s" % urlstring)
  fs[i] = arcpy.FeatureSet()
  fs[i].load(urlstring)

# Save features
print("Saving features...")
fslist = []
for key, value in fs.items():
  fslist.append(value)
arcpy.Merge_management(fslist, outdata)
print("Done!")

I am using Python 3.6 and Spyder installed in an ArcGIS Pro environment. How can I resolve this?

UPDATE
The error seems to be caused by an incorrect query; the example above generates the following urlstring:

https://services.gis.ca.gov/arcgis/rest/services/Environment/Weather_stations/MapServer/2/query?where=OBJECTID >= 1 and OBJECTID <= 403&returnGeometry=true&outFields=*&f=json

Which gives

{"error":{"code":400,"message":"Failed to execute query.","details":[]}}
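As an aside (this may or may not be the cause of the 400 here), the where clause is concatenated into the URL unencoded, so the spaces and >= / <= characters end up literally in the query string. Building the query with urllib.parse.urlencode produces a properly escaped URL. This is a minimal sketch, assuming the same base URL and parameters as in the question:

```python
from urllib.parse import urlencode

# Same service endpoint as in the question; the where clause would
# normally be built per chunk inside the loop.
base_url = ("https://services.gis.ca.gov/arcgis/rest/services/"
            "Environment/Weather_stations/MapServer/2")
params = {
    "where": "OBJECTID >= 1 and OBJECTID <= 403",
    "returnGeometry": "true",
    "outFields": "*",
    "f": "json",
}
# urlencode escapes spaces and comparison operators for us
urlstring = base_url + "/query?" + urlencode(params)
print(urlstring)
```

With this, the where clause is sent as OBJECTID+%3E%3D+1+and+OBJECTID+%3C%3D+403 rather than raw text with spaces.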

One Answer

As per @KHibma's comments above, I modified maxRecordCount to a much lower number and it seems to have solved the problem:

maxrc = 100 #int(js["maxRecordCount"])

I guess I need to find a way to determine the size (number of vertices) of each record and dynamically update the chunk size derived from maxRecordCount.
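Rather than inspecting vertex counts up front, one simpler approach is to start at the service's maxRecordCount and halve the chunk size whenever a load fails, retrying until the chunk succeeds. This is a sketch under assumptions: load_chunk is a hypothetical callable (in the script above it would wrap fs[i] = arcpy.FeatureSet(); fs[i].load(urlstring)) that raises RuntimeError when the service rejects the query:

```python
def load_in_chunks(idlist, maxrc, load_chunk, min_chunk=1):
    """Load records in chunks, halving the chunk size whenever a
    chunk fails, until it succeeds or drops below min_chunk.

    idlist     - sorted list of object IDs
    maxrc      - starting chunk size (the service's maxRecordCount)
    load_chunk - callable taking (fromid, toid); raises RuntimeError
                 on failure and returns the loaded result on success
    """
    results = []
    i = 0
    chunk = maxrc
    while i < len(idlist):
        torec = min(i + chunk - 1, len(idlist) - 1)  # clamp last chunk
        try:
            results.append(load_chunk(idlist[i], idlist[torec]))
            i = torec + 1
        except RuntimeError:
            if chunk <= min_chunk:
                raise  # even a single record fails; give up
            chunk = max(chunk // 2, min_chunk)  # retry smaller
    return results
```

The successful chunk size is kept for subsequent requests, so the script only pays the retry cost until it finds a size the service can handle.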

Answered by Mac Maclean on September 30, 2021
